[ 499.969165] env[65631]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 500.613584] env[65680]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 502.151180] env[65680]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=65680) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 502.151559] env[65680]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=65680) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 502.151638] env[65680]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=65680) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 502.151899] env[65680]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 502.153036] env[65680]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 502.271483] env[65680]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=65680) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 502.282549] env[65680]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=65680) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 502.384817] env[65680]: INFO nova.virt.driver [None req-0e41bb6a-33da-4b07-921d-61a427671cde None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 502.457613] env[65680]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 502.457785] env[65680]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 502.457886] env[65680]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=65680) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 505.612939] env[65680]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-678946da-0180-4361-96f6-2a58ce5761d5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.628208] env[65680]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=65680) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 505.628329] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-834fe3e3-7637-4aab-b81b-65e349395ca5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.653742] env[65680]: INFO oslo_vmware.api [-] Successfully established new session; session ID is bbfcf.
[ 505.653881] env[65680]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.196s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 505.654499] env[65680]: INFO nova.virt.vmwareapi.driver [None req-0e41bb6a-33da-4b07-921d-61a427671cde None None] VMware vCenter version: 7.0.3
[ 505.657937] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a2c71c4-b038-46b4-9188-c8955e7313bc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.674867] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-364c9d97-bd5e-4bdf-a1e1-d436cc68272b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.680621] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075c00ad-7837-46c5-8e5a-fd400a6f7128 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.687014] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c60f0437-de0d-4d80-9083-32f112493391 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.700781] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481a6f1e-e491-4f13-bdf1-1249f2939c8d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.706405] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8563c56-da80-4077-b9b2-284e1ed3d349 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.735744] env[65680]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-b9db2cff-105d-44f4-84a8-6706bdcb4059 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 505.740570] env[65680]: DEBUG nova.virt.vmwareapi.driver [None req-0e41bb6a-33da-4b07-921d-61a427671cde None None] Extension org.openstack.compute already exists. {{(pid=65680) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 505.743192] env[65680]: INFO nova.compute.provider_config [None req-0e41bb6a-33da-4b07-921d-61a427671cde None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 505.761176] env[65680]: DEBUG nova.context [None req-0e41bb6a-33da-4b07-921d-61a427671cde None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),532f501f-badd-4cc0-b4cd-65aa94fb5498(cell1) {{(pid=65680) load_cells /opt/stack/nova/nova/context.py:464}}
[ 505.763023] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 505.763241] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 505.763951] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 505.764302] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Acquiring lock "532f501f-badd-4cc0-b4cd-65aa94fb5498" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 505.764488] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Lock "532f501f-badd-4cc0-b4cd-65aa94fb5498" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 505.765452] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Lock "532f501f-badd-4cc0-b4cd-65aa94fb5498" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 505.777826] env[65680]: DEBUG oslo_db.sqlalchemy.engines [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=65680) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 505.778220] env[65680]: DEBUG oslo_db.sqlalchemy.engines [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=65680) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 505.785131] env[65680]: ERROR nova.db.main.api [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 505.785131] env[65680]: result = function(*args, **kwargs)
[ 505.785131] env[65680]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 505.785131] env[65680]: return func(*args, **kwargs)
[ 505.785131] env[65680]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 505.785131] env[65680]: result = fn(*args, **kwargs)
[ 505.785131] env[65680]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 505.785131] env[65680]: return f(*args, **kwargs)
[ 505.785131] env[65680]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 505.785131] env[65680]: return db.service_get_minimum_version(context, binaries)
[ 505.785131] env[65680]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 505.785131] env[65680]: _check_db_access()
[ 505.785131] env[65680]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 505.785131] env[65680]: stacktrace = ''.join(traceback.format_stack())
[ 505.785131] env[65680]:
[ 505.785866] env[65680]: ERROR nova.db.main.api [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 505.785866] env[65680]: result = function(*args, **kwargs)
[ 505.785866] env[65680]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 505.785866] env[65680]: return func(*args, **kwargs)
[ 505.785866] env[65680]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 505.785866] env[65680]: result = fn(*args, **kwargs)
[ 505.785866] env[65680]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 505.785866] env[65680]: return f(*args, **kwargs)
[ 505.785866] env[65680]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 505.785866] env[65680]: return db.service_get_minimum_version(context, binaries)
[ 505.785866] env[65680]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 505.785866] env[65680]: _check_db_access()
[ 505.785866] env[65680]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 505.785866] env[65680]: stacktrace = ''.join(traceback.format_stack())
[ 505.785866] env[65680]:
[ 505.786217] env[65680]: WARNING nova.objects.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 505.786385] env[65680]: WARNING nova.objects.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Failed to get minimum service version for cell 532f501f-badd-4cc0-b4cd-65aa94fb5498
[ 505.786795] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Acquiring lock "singleton_lock" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 505.786954] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Acquired lock "singleton_lock" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 505.787212] env[65680]: DEBUG oslo_concurrency.lockutils [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Releasing lock "singleton_lock" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 505.787539] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Full set of CONF: {{(pid=65680) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 505.787685] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ******************************************************************************** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 505.787813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] Configuration options gathered from: {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 505.787947] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 505.788152] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 505.788279] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ================================================================================ {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 505.788501] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] allow_resize_to_same_host = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.788679] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] arq_binding_timeout = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.788811] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] backdoor_port = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.788936] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] backdoor_socket = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.789111] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] block_device_allocate_retries = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.789273] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] block_device_allocate_retries_interval = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.789444] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cert = self.pem {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.789634] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.789798] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
compute_monitors = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.789960] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] config_dir = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.790140] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] config_drive_format = iso9660 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.790274] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.790436] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] config_source = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.790600] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] console_host = devstack {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.790773] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] control_exchange = nova {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.790932] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cpu_allocation_ratio = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.791102] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] daemon = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.791268] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] debug = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.791422] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] default_access_ip_network_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.791583] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] default_availability_zone = nova {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.791734] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] default_ephemeral_format = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.791967] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.792143] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] default_schedule_zone = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.792298] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] disk_allocation_ratio = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.792456] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] enable_new_services = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.792631] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] enabled_apis = ['osapi_compute'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.792793] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] enabled_ssl_apis = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.792946] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] flat_injected = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.793115] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] force_config_drive = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.793272] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] force_raw_images = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.793437] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] graceful_shutdown_timeout = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.793608] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] heal_instance_info_cache_interval = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.793804] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] host = cpu-1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.793969] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.794140] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] initial_disk_allocation_ratio = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.794297] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] initial_ram_allocation_ratio = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.794520] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.794686] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_build_timeout = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.794843] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_delete_interval = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795009] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_format = [instance: %(uuid)s] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795177] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_name_template = instance-%08x {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795332] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_usage_audit = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795495] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_usage_audit_period = month {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795655] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795814] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] instances_path = /opt/stack/data/nova/instances {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.795973] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] internal_service_availability_zone = internal {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.796142] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] key = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.796298] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] live_migration_retry_count = 30 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.796464] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_config_append = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.796651] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.796809] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_dir = None {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.796961] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797099] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_options = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797258] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_rotate_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797424] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_rotate_interval_type = days {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797586] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] log_rotation_type = none {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797713] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797834] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.797997] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.798168] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.798293] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.798466] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] long_rpc_timeout = 1800 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.798636] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] max_concurrent_builds = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.798795] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] max_concurrent_live_migrations = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.798949] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] max_concurrent_snapshots = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.799115] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] max_local_block_devices = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.799268] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] max_logfile_count = 30 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.799422] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] max_logfile_size_mb = 200 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.799633] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] maximum_instance_delete_attempts = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.799814] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metadata_listen = 0.0.0.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.799982] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metadata_listen_port = 8775 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.800163] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metadata_workers = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.800347] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] migrate_max_retries = -1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.800516] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] mkisofs_cmd = genisoimage {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.800721] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] my_block_storage_ip = 10.180.1.21 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.800851] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] my_ip = 10.180.1.21 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.801017] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] network_allocate_retries = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.801203] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.801368] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] osapi_compute_listen = 0.0.0.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.801526] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] osapi_compute_listen_port = 8774 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.801691] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] osapi_compute_unique_server_name_scope = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.801859] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] osapi_compute_workers = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802026] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] password_length = 12 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802190] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] periodic_enable = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802349] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] periodic_fuzzy_delay = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802513] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] pointer_model = usbtablet {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802679] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] preallocate_images = none {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802835] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] publish_errors = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.802962] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] pybasedir = /opt/stack/nova {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.803128] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ram_allocation_ratio = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.803286] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rate_limit_burst = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.803448] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rate_limit_except_level = CRITICAL {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.803607] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rate_limit_interval = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.803765] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] reboot_timeout = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.803921] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
reclaim_instance_interval = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.804086] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] record = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.804253] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] reimage_timeout_per_gb = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.804415] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] report_interval = 120 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.804573] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rescue_timeout = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.804733] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] reserved_host_cpus = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.804890] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] reserved_host_disk_mb = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.805058] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] reserved_host_memory_mb = 512 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.805220] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] reserved_huge_pages = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.805387] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] resize_confirm_window = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.805544] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] resize_fs_using_block_device = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.805702] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] resume_guests_state_on_host_boot = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.805867] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.806035] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rpc_response_timeout = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.806199] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] run_external_periodic_tasks = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.806364] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] running_deleted_instance_action = reap 
{{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.806547] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] running_deleted_instance_poll_interval = 1800 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.806708] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] running_deleted_instance_timeout = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.806869] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler_instance_sync_interval = 120 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807009] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_down_time = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807186] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] servicegroup_driver = db {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807347] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] shelved_offload_time = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807504] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] shelved_poll_interval = 3600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807671] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] shutdown_timeout = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807831] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] source_is_ipv6 = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.807988] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ssl_only = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.808243] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.808411] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] sync_power_state_interval = 600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.808642] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] sync_power_state_pool_size = 1000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.808848] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] syslog_log_facility = LOG_USER {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.809026] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] tempdir = None {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.809195] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] timeout_nbd = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.809364] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] transport_url = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.809551] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] update_resources_interval = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.809718] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_cow_images = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.809875] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_eventlog = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810043] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_journal = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810207] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_json = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810365] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_rootwrap_daemon = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810522] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_stderr = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810682] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] use_syslog = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810836] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vcpu_pin_set = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.810999] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plugging_is_fatal = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.811178] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plugging_timeout = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.811342] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] virt_mkfs = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.811500] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] volume_usage_poll_interval = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.811658] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] watch_log_file = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.811821] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] web = /usr/share/spice-html5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 505.812009] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_concurrency.disable_process_locking = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.812313] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.812493] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.812661] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.812832] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.813000] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.813178] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.813360] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.auth_strategy = keystone {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.813524] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.compute_link_prefix = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.813697] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.813863] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.dhcp_domain = novalocal {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.814038] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.enable_instance_password = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.814203] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.glance_link_prefix = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.814365] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.814535] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.814698] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.instance_list_per_project_cells = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.814857] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.list_records_by_skipping_down_cells = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.815025] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.local_metadata_per_cell = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.815194] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.max_limit = 1000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.815360] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.metadata_cache_expiration = 15 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.815533] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.neutron_default_tenant_id = default {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.815696] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.use_forwarded_for = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.815856] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.use_neutron_default_nets = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.816030] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.816195] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.816358] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.816546] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.816728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_dynamic_targets = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.816925] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_jsonfile_path = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.817117] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.817310] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.backend = dogpile.cache.memcached {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.817475] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.backend_argument = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.817643] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.config_prefix = cache.oslo {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.817810] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.dead_timeout = 60.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.817972] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.debug_cache_backend = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.818143] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.enable_retry_client = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.818302] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.enable_socket_keepalive = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.818485] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.enabled = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.818661] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.expiration_time = 600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.818823] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.hashclient_retry_attempts = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.818983] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.hashclient_retry_delay = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.819157] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
cache.memcache_dead_retry = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.819321] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_password = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.819500] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.819674] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.819833] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_pool_maxsize = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.819990] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.820165] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_sasl_enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.820344] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.820510] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_socket_timeout = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.820676] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.memcache_username = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.820840] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.proxies = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821008] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.retry_attempts = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821178] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.retry_delay = 0.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821338] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.socket_keepalive_count = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821498] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.socket_keepalive_idle = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821658] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.socket_keepalive_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821810] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.tls_allowed_ciphers = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.821963] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.tls_cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.822126] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.tls_certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.822282] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.tls_enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.822436] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cache.tls_keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.822603] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.822785] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.auth_type = password {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.822953] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.823142] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.catalog_info = volumev3::publicURL {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.823301] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.823462] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.823624] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.cross_az_attach = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.823783] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.debug = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.823940] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.endpoint_template = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.824112] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 
None None] cinder.http_retries = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.824276] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.824432] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.824600] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.os_region_name = RegionOne {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.824763] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.824943] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cinder.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.825150] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.825313] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.cpu_dedicated_set = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.825469] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.cpu_shared_set = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.825633] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.image_type_exclude_list = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.825794] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.825954] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.max_concurrent_disk_ops = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.826128] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.max_disk_devices_to_attach = -1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.826291] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.826461] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
505.826622] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.resource_provider_association_refresh = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.826783] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.shutdown_retry_interval = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.826959] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.827150] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] conductor.workers = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.827327] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] console.allowed_origins = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.827485] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] console.ssl_ciphers = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.827655] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] console.ssl_minimum_version = default {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.827824] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] consoleauth.token_ttl = 600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.827991] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.828159] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.828318] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.828488] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.828659] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.828817] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.828977] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.insecure = False {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.829145] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.829303] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.829471] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.829640] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.region_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.829796] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.service_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.829961] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.service_type = accelerator {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.830133] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.830293] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.830449] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.830604] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.830781] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.830939] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] cyborg.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.831138] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.backend = sqlalchemy {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.831315] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.connection = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.831482] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.connection_debug = 0 {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.831651] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.connection_parameters = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.831810] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.connection_recycle_time = 3600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.831976] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.connection_trace = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.832148] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.db_inc_retry_interval = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.832308] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.db_max_retries = 20 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.832469] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.db_max_retry_interval = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.832629] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.db_retry_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.832794] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.max_overflow = 50 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.832953] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.max_pool_size = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.833129] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.max_retries = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.833287] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.mysql_enable_ndb = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.833454] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.833611] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.mysql_wsrep_sync_wait = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.833767] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.pool_timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.833933] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.retry_interval = 10 
{{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.834097] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.slave_connection = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.834263] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.sqlite_synchronous = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.834421] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] database.use_db_reconnect = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.834599] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.backend = sqlalchemy {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.834774] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.connection = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.834939] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.connection_debug = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.835120] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.connection_parameters = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.835284] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.connection_recycle_time = 3600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.835449] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.connection_trace = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.835613] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.db_inc_retry_interval = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.835773] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.db_max_retries = 20 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.835930] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.db_max_retry_interval = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.836097] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.db_retry_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.836267] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.max_overflow = 50 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.836426] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.max_pool_size = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.836620] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.max_retries = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.836794] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.mysql_enable_ndb = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.838423] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.838649] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.838835] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.pool_timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.839023] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.retry_interval = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.839194] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.slave_connection = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.839366] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] api_database.sqlite_synchronous = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.839569] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] devices.enabled_mdev_types = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.839757] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.839943] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ephemeral_storage_encryption.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.840134] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.840306] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.api_servers = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.840473] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.cafile = None {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.840639] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.840805] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.840966] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.841138] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.841306] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.debug = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.841473] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.default_trusted_certificate_ids = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.841638] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.enable_certificate_validation = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.841799] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.enable_rbd_download = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.841958] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.842136] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.842301] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.842460] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.842620] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.842783] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.num_retries = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.842952] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.rbd_ceph_conf = {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.843129] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.rbd_connect_timeout = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.843298] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.rbd_pool = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.843466] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.rbd_user = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.843627] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.region_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.843784] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.service_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.843950] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.service_type = image {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.844126] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.844286] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.844442] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.844598] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.844778] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.844939] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.verify_glance_signatures = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.845108] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] glance.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.845277] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] guestfs.debug = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.845445] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.config_drive_cdrom = False {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.845609] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.config_drive_inject_password = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.845776] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.845939] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.enable_instance_metrics_collection = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.846114] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.enable_remotefx = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.846284] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.instances_path_share = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.846453] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.iscsi_initiator_list = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.846635] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.limit_cpu_features = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.846801] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.846975] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.847139] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.power_state_check_timeframe = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.847300] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.847468] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.847631] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.use_multipath_io = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.847794] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.volume_attach_retry_count = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.847954] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.848124] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.vswitch_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.848286] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.848478] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] mks.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.848831] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.849033] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] image_cache.manager_interval = 2400 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.849208] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] image_cache.precache_concurrency = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.849377] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] image_cache.remove_unused_base_images = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.849568] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.849746] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.849920] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] image_cache.subdirectory_name = _base {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.850106] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.api_max_retries = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.850273] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.api_retry_interval = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.850431] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.850592] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.auth_type = None {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.850751] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.850906] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.851077] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.851240] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.851397] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.851555] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.851714] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.851867] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852033] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852197] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852353] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.partition_key = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852515] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.peer_list = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852672] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.region_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852832] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.serial_console_state_timeout = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.852988] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.service_name = None {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.853166] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.service_type = baremetal {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.853326] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.853482] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.853638] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.853795] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.853972] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.854144] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ironic.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.854326] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.854500] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] key_manager.fixed_key = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.854684] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.854845] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.barbican_api_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855014] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.barbican_endpoint = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855185] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.barbican_endpoint_type = public {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855343] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.barbican_region_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855501] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855658] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855819] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.855979] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.856147] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.856306] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.number_of_retries = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.856481] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.retry_delay = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.856689] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.send_service_user_token = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.856863] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.857067] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.857217] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.verify_ssl = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.857371] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican.verify_ssl_path = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.auth_type = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863376] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] barbican_service_user.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.approle_role_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.approle_secret_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.863813] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.kv_mountpoint = secret {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.kv_version = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.namespace = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.root_token_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.ssl_ca_crt_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.use_ssl = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 
None None] keystone.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864728] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.region_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.service_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.service_type = identity {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.864948] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865144] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865144] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] keystone.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865144] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.connection_uri = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865144] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.cpu_mode = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865249] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.cpu_model_extra_flags = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865399] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.cpu_models = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865585] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865757] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.cpu_power_governor_low = powersave {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.865919] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.cpu_power_management = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.866098] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.866267] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.device_detach_attempts = 8 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.866431] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.device_detach_timeout = 20 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.866620] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.disk_cachemodes = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.866782] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.disk_prefix = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.866942] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.enabled_perf_events = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.867163] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.file_backed_memory = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.867279] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.gid_maps = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.867435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.hw_disk_discard = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.867589] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.hw_machine_type = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.867756] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_rbd_ceph_conf = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.867920] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.868094] env[65680]: DEBUG 
oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.868266] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_rbd_glance_store_name = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.868466] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_rbd_pool = rbd {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.868630] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_type = default {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.868794] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.images_volume_group = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.868955] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.inject_key = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.869127] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.inject_partition = -2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.869286] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.inject_password = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.869450] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.iscsi_iface = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.869630] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.iser_use_multipath = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.869794] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_bandwidth = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.869956] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.870127] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_downtime = 500 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.870288] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.870448] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.870605] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_inbound_addr = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.870763] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.870919] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_permit_post_copy = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.871089] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_scheme = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.871264] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_timeout_action = abort {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.871424] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_tunnelled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.871581] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_uri = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.871746] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.live_migration_with_native_tls = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.871903] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.max_queues = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.872073] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.872233] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.nfs_mount_options = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.872539] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.872739] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.872894] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.num_iser_scan_tries = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.873064] env[65680]: DEBUG 
oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.num_memory_encrypted_guests = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.873232] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.873395] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.num_pcie_ports = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.873563] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.num_volume_scan_tries = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.873729] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.pmem_namespaces = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.873889] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.quobyte_client_cfg = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.874185] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.874360] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rbd_connect_timeout = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.874525] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.874690] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.874851] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rbd_secret_uuid = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875014] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rbd_user = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875184] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875352] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.remote_filesystem_transport = ssh {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875509] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rescue_image_id = None {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875668] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rescue_kernel_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875824] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rescue_ramdisk_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.875990] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.876163] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.rx_queue_size = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.876334] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.smbfs_mount_options = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.876631] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.876807] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.snapshot_compression = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.876968] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.snapshot_image_format = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.877207] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.877368] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.sparse_logical_volumes = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.877529] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.swtpm_enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.877699] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.swtpm_group = tss {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.877863] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.swtpm_user = tss {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.878038] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.sysinfo_serial = unique {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.878203] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.tx_queue_size = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.878367] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.uid_maps = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.878555] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.use_virtio_for_bridges = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.878738] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.virt_type = kvm {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.878910] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.volume_clear = zero {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.879084] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.volume_clear_size = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.879251] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.volume_use_multipath = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.879411] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_cache_path = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.879602] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.879778] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_mount_group = qemu {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.879945] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_mount_opts = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.880126] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.880405] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.880582] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.vzstorage_mount_user = stack {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.880750] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.880923] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.881108] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.auth_type = password {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.881275] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.881435] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.881597] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.881756] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.881915] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.882099] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.default_floating_pool = public {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.882264] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.882430] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.extension_sync_interval = 600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.882593] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.http_retries = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.882756] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.882916] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.883088] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.883264] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.883425] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.883593] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.ovs_bridge = br-int {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.883762] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.physnets = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.883930] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.region_name = RegionOne {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.884109] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.service_metadata_proxy = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.884271] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.service_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.884437] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.service_type = network {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.884601] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.884762] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.884922] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.885092] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.885276] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.885442] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] neutron.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.885613] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] notifications.bdms_in_notifications = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.885791] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] notifications.default_level = INFO 
{{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.885963] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] notifications.notification_format = unversioned {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.886136] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] notifications.notify_on_state_change = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.886311] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.886506] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] pci.alias = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.886690] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] pci.device_spec = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.886857] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] pci.report_in_placement = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.887038] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.887215] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.auth_type = password {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.887404] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.887538] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.887695] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.887855] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.888015] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.888178] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.888334] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.default_domain_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.888513] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.default_domain_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.888687] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.domain_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.888845] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.domain_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889109] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889162] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889321] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889487] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889661] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889827] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.password = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.889985] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.project_domain_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.890164] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.project_domain_name = Default {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.890332] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.project_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.890501] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.project_name = service {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.890670] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.region_name = RegionOne {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.890826] env[65680]: DEBUG oslo_service.service 
[None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.service_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.890991] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.service_type = placement {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.891165] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.891321] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.891478] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.891634] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.system_scope = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.891790] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.891946] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.trust_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.892113] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.user_domain_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.892278] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.user_domain_name = Default {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.892442] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.user_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.892607] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.username = placement {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.892788] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.892949] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] placement.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.893140] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.cores = 20 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.893304] env[65680]: DEBUG 
oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.count_usage_from_placement = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.893476] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.893651] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.injected_file_content_bytes = 10240 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.893818] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.injected_file_path_length = 255 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.893980] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.injected_files = 5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.894158] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.instances = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.894322] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.key_pairs = 100 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.894484] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.metadata_items = 128 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.894647] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.ram = 51200 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.894804] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.recheck_quota = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.894970] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.server_group_members = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.895145] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] quota.server_groups = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.895313] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rdp.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.895627] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.895818] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.895987] 
env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.896169] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.image_metadata_prefilter = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.896333] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.896517] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.max_attempts = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.896697] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.max_placement_results = 1000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.896860] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.897033] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.query_placement_for_availability_zone = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.897200] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.query_placement_for_image_type_support = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.897361] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.897535] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] scheduler.workers = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.897709] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.897879] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.898066] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.898239] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.898407] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.898596] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.898769] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.898958] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.899143] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.host_subset_size = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.899306] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.899484] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.899666] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.isolated_hosts = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.899832] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.isolated_images = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.899995] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.900171] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.900335] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.pci_in_placement = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.900499] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.900661] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.900824] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.900983] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.901159] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.901325] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.901489] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.track_instance_changes = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.901665] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.901832] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metrics.required = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.901997] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metrics.weight_multiplier = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.902172] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.902335] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] metrics.weight_setting = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.902628] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.902804] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] serial_console.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.902979] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] serial_console.port_range = 10000:20000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.903162] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.903329] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.903496] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] serial_console.serialproxy_port = 6083 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.903661] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.903832] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.auth_type = password {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.903992] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.904161] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.904321] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.904480] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.904634] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.904802] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.send_service_user_token = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.904962] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] service_user.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.905130] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None 
None] service_user.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.905299] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.agent_enabled = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.905472] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.905763] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.905951] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.906135] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.html5proxy_port = 6082 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.906295] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.image_compression = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.906457] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.jpeg_compression = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.906639] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.playback_compression = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.906811] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.server_listen = 127.0.0.1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.906980] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.907154] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.streaming_mode = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.907310] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] spice.zlib_compression = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.907475] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] upgrade_levels.baseapi = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.907631] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] upgrade_levels.cert = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.907798] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] upgrade_levels.compute = auto {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.907957] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] upgrade_levels.conductor = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.908137] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] upgrade_levels.scheduler = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.908308] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.908504] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.auth_type = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.908669] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.908828] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.908991] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.909166] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.909323] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.909500] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.909670] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vendordata_dynamic_auth.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.909843] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.api_retry_count = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.910008] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.ca_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.910188] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.cache_prefix = devstack-image-cache {{(pid=65680) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.910353] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.cluster_name = testcl1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.910517] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.connection_pool_size = 10 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.910693] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.console_delay_seconds = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.910916] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.datastore_regex = ^datastore.* {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.911143] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.911321] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.host_password = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.911488] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.host_port = 443 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.911659] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.host_username = administrator@vsphere.local {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.911824] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.insecure = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.911983] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.integration_bridge = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.912159] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.maximum_objects = 100 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.912316] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.pbm_default_policy = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.912478] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.pbm_enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.912635] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.pbm_wsdl_location = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.912802] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.912958] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.serial_port_proxy_uri = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.913122] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.serial_port_service_uri = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.913290] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.task_poll_interval = 0.5 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.913460] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.use_linked_clone = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.913631] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.vnc_keymap = en-us {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.913792] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.vnc_port = 5900 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.913952] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vmware.vnc_port_total = 10000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.914150] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.auth_schemes = ['none'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.914327] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.914613] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.914797] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.914966] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.novncproxy_port = 6080 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.915154] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.server_listen = 127.0.0.1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.915327] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.915487] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 
None None] vnc.vencrypt_ca_certs = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.915645] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.vencrypt_client_cert = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.915799] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vnc.vencrypt_client_key = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.915978] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.916154] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.disable_deep_image_inspection = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.916315] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.916489] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.916663] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.916826] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.disable_rootwrap = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.916985] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.enable_numa_live_migration = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.917160] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.917319] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.917478] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.917645] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.libvirt_disable_apic = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.917798] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.917956] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.918129] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.918292] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.918454] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.918638] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.918802] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.918961] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.919131] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.919297] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.919504] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.919687] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.client_socket_timeout = 900 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.919854] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.default_pool_size = 1000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920028] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.keep_alive = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920199] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
wsgi.max_header_line = 16384 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920359] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.secure_proxy_ssl_header = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920519] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.ssl_ca_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920678] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.ssl_cert_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920833] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.ssl_key_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.920997] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.tcp_keepidle = 600 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.921182] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.921348] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] zvm.ca_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.921508] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] zvm.cloud_connector_url = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.921784] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.921955] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] zvm.reachable_timeout = 300 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.922152] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.enforce_new_defaults = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.922326] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.enforce_scope = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.922502] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.policy_default_rule = default {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.922683] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
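The long run of DEBUG records above and below is oslo.config dumping every registered option at service startup via ConfigOpts.log_opt_values(), the call named at cfg.py:2609 in each record. A minimal, self-contained sketch of that mechanism follows; the [vmware] option names are taken from the log itself, but the registration code is illustrative rather than Nova's actual source.

import logging

from oslo_config import cfg

LOG = logging.getLogger(__name__)
CONF = cfg.CONF

# Two of the options that appear in the dump; a real service registers
# hundreds of these across many groups.
vmware_opts = [
    cfg.StrOpt('host_ip', help='Hostname or IP address of the vCenter server.'),
    cfg.StrOpt('cluster_name', help='Name of the vCenter cluster to use.'),
]
CONF.register_opts(vmware_opts, group='vmware')

if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG)
    # A real service would pass --config-file /etc/nova/nova.conf here;
    # an empty argument list keeps the sketch runnable anywhere.
    CONF([], project='nova')
    # Emits one "group.option = value" DEBUG line per registered option,
    # which matches the shape of the records in this log.
    CONF.log_opt_values(LOG, logging.DEBUG)
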
[ 505.922856] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.policy_file = policy.yaml {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.923031] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.923197] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.923356] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.923511] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.923673] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.923836] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.924015] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.924198] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.connection_string = messaging:// {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.924366] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.enabled = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.924533] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.es_doc_type = notification {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.924696] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.es_scroll_size = 10000 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.924863] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.es_scroll_time = 2m {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.925034] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.filter_error_trace = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.925205] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.hmac_keys = SECRET_KEY {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.925371] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.sentinel_service_name = mymaster {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.925541] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.socket_timeout = 0.1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.925704] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] profiler.trace_sqlalchemy = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.925867] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] remote_debug.host = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.926033] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] remote_debug.port = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.926214] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.926378] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.926563] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.926738] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.926901] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.927072] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.927236] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.927396] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.927554] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.927711] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.927880] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.928063] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.928245] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.928404] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.928598] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.928782] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.928946] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.929124] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.929291] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.929465] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.929642] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.929809] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.929969] env[65680]: DEBUG 
oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.930142] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.930307] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.930471] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.ssl = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.930642] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.930809] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.930970] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.931153] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.931323] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_rabbit.ssl_version = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.931511] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.931678] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_notifications.retry = -1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.931860] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.932059] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_messaging_notifications.transport_url = **** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.932239] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.auth_section = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.932400] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.auth_type = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.932555] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.cafile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.932715] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.certfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.932868] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.collect_timing = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933030] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.connect_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933191] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.connect_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933345] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.endpoint_id = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933498] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.endpoint_override = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933655] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.insecure = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933805] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.keyfile = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.933957] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.max_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.934119] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.min_version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.934270] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.region_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.934421] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.service_name = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.934574] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.service_type = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.934734] env[65680]: DEBUG oslo_service.service [None 
req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.split_loggers = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.934889] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.status_code_retries = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935050] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.status_code_retry_delay = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935206] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.timeout = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935358] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.valid_interfaces = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935509] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_limit.version = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935671] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_reports.file_event_handler = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935831] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.935985] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] oslo_reports.log_dir = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.936165] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.936322] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.936497] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.936678] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.936841] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.936998] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.937261] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.937441] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_ovs_privileged.group = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.937642] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.937827] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.937993] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.938165] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] vif_plug_ovs_privileged.user = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.938368] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.flat_interface = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.938601] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.938828] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.939016] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.939194] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.939360] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.939528] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.939686] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.939862] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_ovs.isolate_vif = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.940039] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.940207] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.940380] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.940547] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_ovs.ovsdb_interface = native {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.940709] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_vif_ovs.per_port_bridge = False {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.940871] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_brick.lock_path = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.941042] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.941207] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.941371] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] privsep_osbrick.capabilities = [21] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.941528] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] privsep_osbrick.group = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.941684] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] privsep_osbrick.helper_command = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.941843] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942009] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942171] env[65680]: DEBUG oslo_service.service 
[None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] privsep_osbrick.user = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942339] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942495] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] nova_sys_admin.group = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942648] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] nova_sys_admin.helper_command = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942809] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.942966] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.943131] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] nova_sys_admin.user = None {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 505.943260] env[65680]: DEBUG oslo_service.service [None req-835d162d-a362-4cb9-8b2f-914f24bb9f39 None None] ******************************************************************************** {{(pid=65680) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 505.943685] env[65680]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 505.952420] env[65680]: INFO nova.virt.node [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Generated node identity 93ae29e4-bd04-4c19-80be-8057217cf400 [ 505.952675] env[65680]: INFO nova.virt.node [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Wrote node identity 93ae29e4-bd04-4c19-80be-8057217cf400 to /opt/stack/data/n-cpu-1/compute_id [ 505.963474] env[65680]: WARNING nova.compute.manager [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Compute nodes ['93ae29e4-bd04-4c19-80be-8057217cf400'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 505.992974] env[65680]: INFO nova.compute.manager [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 506.011102] env[65680]: WARNING nova.compute.manager [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
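The resource-tracker records immediately below serialize on a shared "compute_resources" lock through oslo.concurrency's lockutils (the Acquiring / acquired / released lines, with their "waited 0.000s" and "held 0.339s" timings). A rough illustration of that pattern, assuming simplified class and method names that are not Nova's actual code:

from oslo_concurrency import lockutils

class ToyResourceTracker:
    # Decorator form: every method guarded by the same lock name shares one
    # lock, which is what produces the waited/held timings in the records
    # below.
    @lockutils.synchronized('compute_resources')
    def update_available_resource(self):
        pass

# Equivalent context-manager form:
def audit_once():
    with lockutils.lock('compute_resources'):
        pass
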
[ 506.011301] env[65680]: DEBUG oslo_concurrency.lockutils [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 506.011506] env[65680]: DEBUG oslo_concurrency.lockutils [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 506.011652] env[65680]: DEBUG oslo_concurrency.lockutils [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 506.011807] env[65680]: DEBUG nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 506.012883] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6489e2e-5442-4431-b84c-c004b2ac30f9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.021386] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed79e0e9-2a9b-4e16-bec3-2cdeff6b8687 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.034718] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2947be56-d0a8-4da2-9706-5a7ee41da81a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.040495] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-737dd85b-7792-4eb6-b587-70b24eea4187 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.068645] env[65680]: DEBUG nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181092MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 506.068799] env[65680]: DEBUG oslo_concurrency.lockutils [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 506.068970] env[65680]: DEBUG oslo_concurrency.lockutils [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 506.080287] env[65680]: WARNING nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] No compute node 
record for cpu-1:93ae29e4-bd04-4c19-80be-8057217cf400: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 93ae29e4-bd04-4c19-80be-8057217cf400 could not be found. [ 506.091741] env[65680]: INFO nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 93ae29e4-bd04-4c19-80be-8057217cf400 [ 506.140885] env[65680]: DEBUG nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 506.141035] env[65680]: DEBUG nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 506.241847] env[65680]: INFO nova.scheduler.client.report [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] [req-552dde57-e4f9-44ac-be17-1cfab31be9e7] Created resource provider record via placement API for resource provider with UUID 93ae29e4-bd04-4c19-80be-8057217cf400 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 506.257843] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9307edc-fcc3-4e83-ac10-0cadb2f476bc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.265167] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cf0191e-eb9b-453c-a1b0-ee8681db1918 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.294248] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f26b42a6-e1af-4c28-af05-2fe1dc2e36a2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.300840] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08dbe26c-343b-44e7-af7a-a832986cf881 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.313157] env[65680]: DEBUG nova.compute.provider_tree [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Updating inventory in ProviderTree for provider 93ae29e4-bd04-4c19-80be-8057217cf400 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 506.348250] env[65680]: DEBUG nova.scheduler.client.report [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Updated inventory for provider 93ae29e4-bd04-4c19-80be-8057217cf400 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 506.348522] env[65680]: DEBUG nova.compute.provider_tree [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Updating resource provider 93ae29e4-bd04-4c19-80be-8057217cf400 generation from 0 to 1 during operation: update_inventory {{(pid=65680) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 506.348685] env[65680]: DEBUG nova.compute.provider_tree [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Updating inventory in ProviderTree for provider 93ae29e4-bd04-4c19-80be-8057217cf400 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 506.391110] env[65680]: DEBUG nova.compute.provider_tree [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Updating resource provider 93ae29e4-bd04-4c19-80be-8057217cf400 generation from 1 to 2 during operation: update_traits {{(pid=65680) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 506.407751] env[65680]: DEBUG nova.compute.resource_tracker [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 506.407910] env[65680]: DEBUG oslo_concurrency.lockutils [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.339s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 506.408082] env[65680]: DEBUG nova.service [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Creating RPC server for service compute {{(pid=65680) start /opt/stack/nova/nova/service.py:182}} [ 506.420859] env[65680]: DEBUG nova.service [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] Join ServiceGroup membership for this service compute {{(pid=65680) start /opt/stack/nova/nova/service.py:199}} [ 506.421065] env[65680]: DEBUG nova.servicegroup.drivers.db [None req-003b6186-7731-4c02-9847-83c4c6d78b4e None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=65680) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 546.164402] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock "eecb0c81-3810-4edd-b2da-032be990dcdc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 546.164760] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 
tempest-ServerDiagnosticsV248Test-1787241679-project-member] Lock "eecb0c81-3810-4edd-b2da-032be990dcdc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 546.199330] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 546.356543] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 546.358187] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 546.358467] env[65680]: INFO nova.compute.claims [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 546.504312] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a81c41-8ca3-4588-a037-979915950173 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 546.513553] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dec3ca4d-f597-4507-83d9-2e9edf474bbb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 546.552475] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12e4c79a-8ec0-479f-8517-dec27cc65b09 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 546.561116] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79dab9b7-a08f-41f1-9e7c-658dec31ee5a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 546.576400] env[65680]: DEBUG nova.compute.provider_tree [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 546.592186] env[65680]: DEBUG nova.scheduler.client.report [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Inventory has not changed for provider 
93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 546.619659] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 546.620352] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 546.687251] env[65680]: DEBUG nova.compute.utils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 546.691203] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Not allocating networking since 'none' was specified. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 546.706335] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 546.805901] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Start spawning the instance on the hypervisor. 
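Annotation: the nova.compute.provider_tree and nova.scheduler.client.report entries above keep reporting the same inventory payload for provider 93ae29e4-bd04-4c19-80be-8057217cf400. Below is a minimal, illustrative sketch (not Nova's actual resource-tracker code) of how a payload with that shape can be assembled; the numbers are copied from the log lines above, and the helper name is made up for this annotation.

# Illustrative only: assemble a placement inventory dict shaped like the one
# logged for provider 93ae29e4-bd04-4c19-80be-8057217cf400.
def build_inventory(total_vcpus, memory_mb, disk_gb,
                    max_vcpu, max_mb, max_gb,
                    cpu_ratio=4.0, ram_ratio=1.0, disk_ratio=1.0,
                    reserved_mb=512):
    return {
        'VCPU': {'total': total_vcpus, 'reserved': 0, 'min_unit': 1,
                 'max_unit': max_vcpu, 'step_size': 1,
                 'allocation_ratio': cpu_ratio},
        'MEMORY_MB': {'total': memory_mb, 'reserved': reserved_mb,
                      'min_unit': 1, 'max_unit': max_mb, 'step_size': 1,
                      'allocation_ratio': ram_ratio},
        'DISK_GB': {'total': disk_gb, 'reserved': 0, 'min_unit': 1,
                    'max_unit': max_gb, 'step_size': 1,
                    'allocation_ratio': disk_ratio},
    }

# Values taken verbatim from the log above:
inventory = build_inventory(48, 196590, 400, 16, 65530, 168)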
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 547.079744] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 547.081158] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 547.081158] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 547.081158] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 547.081158] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 547.081158] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 547.081581] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 547.081581] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
547.082539] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 547.082539] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 547.082539] env[65680]: DEBUG nova.virt.hardware [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 547.083100] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad9464c0-58b9-4e96-9b3e-06083633d9c5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.093408] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b76818-ba4b-4dba-ab77-52fdf72bb749 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.114118] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0692a97d-dfb1-4b31-9905-039c1c504306 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.135134] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Instance VIF info [] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 547.149260] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 547.149260] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2a6524e6-66c1-41ca-a5b7-4120d1850494 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.162857] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Created folder: OpenStack in parent group-v4. [ 547.162857] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Creating folder: Project (2da5b2bf0c96494f8613f57dabd7413f). Parent ref: group-v572532. 
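Annotation: the nova.virt.hardware lines above enumerate candidate CPU topologies for the 1-vCPU m1.nano flavor and settle on VirtCPUTopology(cores=1,sockets=1,threads=1). The sketch below is a simplified stand-in for that enumeration (it is not the real _get_possible_cpu_topologies): it lists every (sockets, cores, threads) combination whose product equals the vCPU count and that stays within the 65536 per-dimension limits logged above.

# Simplified sketch of CPU topology enumeration (not nova.virt.hardware itself).
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    topologies = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    topologies.append((sockets, cores, threads))
    return topologies

print(possible_topologies(1))   # [(1, 1, 1)], matching "Got 1 possible topologies" above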
{{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 547.162857] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6bf085f-954d-4de0-9a66-571bad1baa25 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.172461] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Created folder: Project (2da5b2bf0c96494f8613f57dabd7413f) in parent group-v572532. [ 547.172461] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Creating folder: Instances. Parent ref: group-v572533. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 547.174210] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bb81fd03-6c34-4966-aa65-821728a3e5d3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.184342] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Created folder: Instances in parent group-v572533. [ 547.184747] env[65680]: DEBUG oslo.service.loopingcall [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 547.185557] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 547.185557] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e36b6d43-c8f6-46da-b230-7b668598c62c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.205509] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 547.205509] env[65680]: value = "task-2847824" [ 547.205509] env[65680]: _type = "Task" [ 547.205509] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 547.221504] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847824, 'name': CreateVM_Task} progress is 5%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 547.730505] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847824, 'name': CreateVM_Task, 'duration_secs': 0.294577} completed successfully. 
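Annotation: the CreateVM_Task lines above ("Waiting for the task", the "progress is N%" polling, and the final duration_secs result) follow the usual oslo.vmware pattern of issuing the SOAP call and then blocking on the returned task. A hedged sketch of that pattern is below; it assumes an already-built oslo_vmware.api.VMwareAPISession, and vm_folder, config_spec and res_pool are placeholder arguments, not values recovered from this log.

# Hedged sketch of the create-and-wait pattern behind the task lines above.
def create_vm(session, vm_folder, config_spec, res_pool):
    # invoke_api() issues the SOAP request (the "Invoking Folder.CreateVM_Task
    # with opID=..." entry above).
    task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                              config=config_spec, pool=res_pool)
    # wait_for_task() polls the task until it finishes (the _poll_task
    # "progress is N%" entries) and raises if the task errors out.
    task_info = session.wait_for_task(task)
    return task_info.result   # managed object reference of the new VM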
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 547.730505] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 547.731122] env[65680]: DEBUG oslo_vmware.service [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba4ba868-3f0f-474d-86c5-314460d29342 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.739200] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "a8b4f796-2893-4c05-be82-16a1bfd46db9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 547.739597] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "a8b4f796-2893-4c05-be82-16a1bfd46db9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 547.743452] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 547.743452] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 547.744484] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 547.746458] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-320f9cb7-8331-466f-8e57-71c23236a23b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.749675] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Waiting for the task: (returnval){ [ 547.749675] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5299f6cc-da49-0b50-e35b-d775a311534b" [ 547.749675] env[65680]: _type = "Task" [ 547.749675] 
env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 547.759096] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5299f6cc-da49-0b50-e35b-d775a311534b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 547.761210] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 547.814194] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 547.814479] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 547.815983] env[65680]: INFO nova.compute.claims [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 547.949063] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d2c7411-cc35-45ad-8e9b-29183e809ddb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.958031] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8221305-d689-40bf-83bf-b741af544297 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.994321] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d06dc4a7-53bb-4c10-8023-af44389bdb50 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.001816] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f6d6699-b884-44a2-8024-048a895f229d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.021048] env[65680]: DEBUG nova.compute.provider_tree [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 548.033579] env[65680]: DEBUG 
nova.scheduler.client.report [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 548.052952] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 548.054442] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 548.121035] env[65680]: DEBUG nova.compute.utils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 548.121697] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 548.121931] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 548.143394] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 548.239224] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Start spawning the instance on the hypervisor. 
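Annotation: most of the 'Acquiring lock' / 'acquired ... waited' / '"released" ... held' triplets in this log (the compute_resources lock, the per-instance UUID locks, the image-cache locks) are emitted by oslo.concurrency. A minimal sketch of the two primitives involved is below; the function bodies and lock names are illustrative only.

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim():
    # Runs with the named semaphore held; the decorator's wrapper logs the
    # Acquiring/acquired/"released" lines seen throughout this log.
    pass

def touch_image_cache(image_id):
    # The 'Acquired lock' / 'Releasing lock' variants around the image cache
    # come from the context-manager form of the same primitive.
    with lockutils.lock('[datastore1] devstack-image-cache_base/%s' % image_id):
        pass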
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 548.271685] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 548.271685] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 548.273336] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 548.273336] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 548.273336] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 548.273336] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f201a616-7abd-4038-84df-2c3ceb959d6d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.277354] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 548.277584] env[65680]: DEBUG nova.virt.hardware [None 
req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 548.277739] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 548.278245] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 548.278245] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 548.278245] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 548.278405] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 548.278931] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 548.279141] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 548.279385] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 548.279473] env[65680]: DEBUG nova.virt.hardware [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 548.280670] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89732e6d-c065-4f24-a1e1-71df305b3369 {{(pid=65680) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.301721] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-845d069a-c54f-4e2e-b5f5-e67cad573fc3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.307180] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 548.307389] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 548.308732] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e017fd5d-0f30-4f8c-9675-285b562b608b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.332636] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17d463ee-beb0-42ba-bc7e-39a407a8ac75 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.338832] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Waiting for the task: (returnval){ [ 548.338832] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]523fb46c-95bc-e3e4-ab36-a125d68cddf3" [ 548.338832] env[65680]: _type = "Task" [ 548.338832] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 548.354267] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]523fb46c-95bc-e3e4-ab36-a125d68cddf3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 548.378789] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquiring lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 548.378879] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 548.390762] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 548.448192] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 548.448895] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 548.450520] env[65680]: INFO nova.compute.claims [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 548.584031] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5619fd0c-6bd1-4abe-849b-b7cacd5c0c0d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.596940] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d80891c-6ff2-4c9a-a688-56e2e3a494b4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.637001] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86da184b-4360-4413-9d76-3ef7ba332961 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.647209] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac6fecab-0a0d-4b79-8d83-7b47abadab4d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.665580] 
env[65680]: DEBUG nova.compute.provider_tree [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 548.686594] env[65680]: DEBUG nova.scheduler.client.report [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 548.704607] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 548.705273] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 548.753150] env[65680]: DEBUG nova.compute.utils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 548.755863] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 548.755863] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 548.772181] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 548.848194] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 548.848573] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Creating directory with path [datastore1] vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 548.848707] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c73995ea-0fed-455b-a99c-c9c58a6137f4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.882344] env[65680]: DEBUG nova.policy [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '08b19df46e87419c96438881fe3bbfb4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24062f1d9b7b4dcc83bfa71eb177f283', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 548.889379] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Start spawning the instance on the hypervisor. 
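Annotation: the nova.policy entry above shows a check of network:attach_external_network being denied for a member/reader token; the build simply continues without that capability. As a rough illustration of the oslo.policy mechanics behind such a check (the rule string below is made up for this sketch and is not Nova's real default), the decision boils down to:

from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
# Register an illustrative default that only admin-role tokens would pass.
enforcer.register_default(policy.RuleDefault('network:attach_external_network',
                                             'role:admin'))

creds = {'roles': ['member', 'reader'],
         'project_id': '24062f1d9b7b4dcc83bfa71eb177f283',
         'is_admin': False}
# Returns False for these credentials, matching the "Policy check ... failed"
# line above.
allowed = enforcer.authorize('network:attach_external_network', {}, creds)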
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 548.896397] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Created directory with path [datastore1] vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 548.899020] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Fetch image to [datastore1] vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 548.899020] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 548.899020] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94224fca-54b3-4238-8b3c-a5aff41526c6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.906704] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33d91ea3-5f4a-45d3-888f-be1ea4c3fd49 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.927544] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a35f71e-d545-4507-91ce-a742bcbc71e4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.963448] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 548.963610] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 
tempest-TenantUsagesTestJSON-1938623030-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 548.963755] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 548.963924] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 548.964072] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 548.964212] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 548.964437] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 548.964613] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 548.964777] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 548.964984] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 548.965151] env[65680]: DEBUG nova.virt.hardware [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 548.965945] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c20e02c-5843-4b46-9932-15b34cf35669 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.969156] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5d918c84-f45d-4bfe-ac5b-75ebaf7ff06c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.977450] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a7866544-e4d1-49c9-aecf-7f7543b57d93 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.980569] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae640c8-c3d9-4b1b-8194-651ea22caadf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.070033] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 549.147193] env[65680]: DEBUG oslo_vmware.rw_handles [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 549.217691] env[65680]: DEBUG nova.policy [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '16f5ae5429ad497aac55a43dddad9188', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4aed8af4274b4769bcba3f358a2eb421', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 549.221366] env[65680]: DEBUG oslo_vmware.rw_handles [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 549.221578] env[65680]: DEBUG oslo_vmware.rw_handles [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 549.615966] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquiring lock "d98c190b-7d45-4e74-909d-75b38bfc6554" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 549.615966] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Lock "d98c190b-7d45-4e74-909d-75b38bfc6554" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 549.628146] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 549.689348] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 549.689348] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 549.693101] env[65680]: INFO nova.compute.claims [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 549.858081] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d047ff32-064c-4c71-aaa3-6ed42adda7c7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.866584] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc6e2fe-698c-46f8-b916-72bc0ed7650c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.908988] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6f55b20-a49e-4021-a48c-1552789888e6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.919303] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d3a2597-8a52-4da5-a6c8-d3784239e7de {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.934586] env[65680]: DEBUG nova.compute.provider_tree [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 549.946251] env[65680]: DEBUG nova.scheduler.client.report [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 549.966606] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 549.967241] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 550.019854] env[65680]: DEBUG nova.compute.utils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 550.021779] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 550.021779] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 550.033795] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 550.110633] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 550.137142] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 550.137142] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 550.137142] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 550.137392] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 550.137558] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 550.137674] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 550.137981] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 550.138439] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 550.138522] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 550.138702] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 550.139073] env[65680]: DEBUG nova.virt.hardware [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 550.140241] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eca932e-0cf6-4cd4-a650-d48a79aa7cbb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.152194] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-040280c3-7dc4-4fbe-987b-74370e06cb1b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.351029] env[65680]: DEBUG nova.policy [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5081899e6bd446e09db97d972eeefa21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b48a41a6b634f9fa85b86451056713e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 550.968215] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Successfully created port: 332a8d1b-b53c-4a60-a737-eaa3a81d494b {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 551.343968] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Successfully created port: 0845dcf6-d5f1-44e3-ad80-12eeb1087407 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 552.261115] env[65680]: DEBUG 
nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Successfully created port: 50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 553.440171] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Successfully updated port: 0845dcf6-d5f1-44e3-ad80-12eeb1087407 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 553.455177] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "refresh_cache-a8b4f796-2893-4c05-be82-16a1bfd46db9" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 553.455177] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquired lock "refresh_cache-a8b4f796-2893-4c05-be82-16a1bfd46db9" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 553.455177] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 553.762453] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 554.264650] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Successfully updated port: 332a8d1b-b53c-4a60-a737-eaa3a81d494b {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 554.275293] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquiring lock "refresh_cache-4bea49fd-7709-4aa8-86ac-c08ee943dd73" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 554.275293] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquired lock "refresh_cache-4bea49fd-7709-4aa8-86ac-c08ee943dd73" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 554.275293] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 554.567213] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 554.987020] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Updating instance_info_cache with network_info: [{"id": "0845dcf6-d5f1-44e3-ad80-12eeb1087407", "address": "fa:16:3e:00:35:71", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0845dcf6-d5", "ovs_interfaceid": "0845dcf6-d5f1-44e3-ad80-12eeb1087407", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 555.005015] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Releasing lock "refresh_cache-a8b4f796-2893-4c05-be82-16a1bfd46db9" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 555.005310] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Instance network_info: |[{"id": "0845dcf6-d5f1-44e3-ad80-12eeb1087407", "address": "fa:16:3e:00:35:71", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0845dcf6-d5", "ovs_interfaceid": "0845dcf6-d5f1-44e3-ad80-12eeb1087407", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 555.005758] env[65680]: DEBUG 
nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:00:35:71', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c02dd284-ab80-451c-93eb-48c8360acb9c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0845dcf6-d5f1-44e3-ad80-12eeb1087407', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 555.021528] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Creating folder: Project (24062f1d9b7b4dcc83bfa71eb177f283). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.028977] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46ffdea9-a41b-4fec-a879-cb6d7d9768b0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.043392] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Created folder: Project (24062f1d9b7b4dcc83bfa71eb177f283) in parent group-v572532. [ 555.043622] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Creating folder: Instances. Parent ref: group-v572536. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.043814] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-463f6828-37e4-4c7d-9423-66d996324851 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.052528] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Created folder: Instances in parent group-v572536. [ 555.052753] env[65680]: DEBUG oslo.service.loopingcall [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 555.053045] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 555.053146] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a8fef424-e672-4e53-bbaf-9a341b122a69 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.080860] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 555.080860] env[65680]: value = "task-2847827" [ 555.080860] env[65680]: _type = "Task" [ 555.080860] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 555.088608] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847827, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 555.111925] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Successfully updated port: 50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 555.125894] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquiring lock "refresh_cache-d98c190b-7d45-4e74-909d-75b38bfc6554" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 555.125894] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquired lock "refresh_cache-d98c190b-7d45-4e74-909d-75b38bfc6554" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 555.125894] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 555.222862] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 555.241154] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Updating instance_info_cache with network_info: [{"id": "332a8d1b-b53c-4a60-a737-eaa3a81d494b", "address": "fa:16:3e:a2:8c:99", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.156", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap332a8d1b-b5", "ovs_interfaceid": "332a8d1b-b53c-4a60-a737-eaa3a81d494b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 555.262408] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Releasing lock "refresh_cache-4bea49fd-7709-4aa8-86ac-c08ee943dd73" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 555.262408] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Instance network_info: |[{"id": "332a8d1b-b53c-4a60-a737-eaa3a81d494b", "address": "fa:16:3e:a2:8c:99", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.156", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap332a8d1b-b5", "ovs_interfaceid": "332a8d1b-b53c-4a60-a737-eaa3a81d494b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 555.262602] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None 
req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a2:8c:99', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c02dd284-ab80-451c-93eb-48c8360acb9c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '332a8d1b-b53c-4a60-a737-eaa3a81d494b', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 555.273224] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Creating folder: Project (4aed8af4274b4769bcba3f358a2eb421). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.273863] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c1044581-0078-4ad6-93e8-5a106b38609e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.285270] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Created folder: Project (4aed8af4274b4769bcba3f358a2eb421) in parent group-v572532. [ 555.285787] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Creating folder: Instances. Parent ref: group-v572539. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.285917] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7eb3ab68-9447-4329-afdf-ada26c50bb3b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.300894] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Created folder: Instances in parent group-v572539. [ 555.300894] env[65680]: DEBUG oslo.service.loopingcall [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 555.300894] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 555.300894] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d41f41ec-e32d-429b-9b6d-7911e6ef28e4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.326061] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 555.326061] env[65680]: value = "task-2847830" [ 555.326061] env[65680]: _type = "Task" [ 555.326061] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 555.336734] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847830, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 555.593951] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847827, 'name': CreateVM_Task, 'duration_secs': 0.339158} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 555.593951] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 555.787545] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 555.787545] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 555.787545] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 555.787545] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-327449fd-4b9d-482c-b23a-d7efc3a9db01 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.796661] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Waiting for the task: (returnval){ [ 555.796661] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]522f55d9-506f-958e-1449-5114d66a3558" [ 555.796661] env[65680]: _type = "Task" [ 555.796661] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 555.805891] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]522f55d9-506f-958e-1449-5114d66a3558, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 555.837392] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847830, 'name': CreateVM_Task} progress is 99%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 555.893751] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Updating instance_info_cache with network_info: [{"id": "50897877-7974-45f4-be58-52d3c88d26c1", "address": "fa:16:3e:a5:46:8d", "network": {"id": "befe768f-b79e-4bbf-8bb5-1f80b78e45db", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1138998705-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b48a41a6b634f9fa85b86451056713e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50897877-79", "ovs_interfaceid": "50897877-7974-45f4-be58-52d3c88d26c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 555.915324] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Releasing lock "refresh_cache-d98c190b-7d45-4e74-909d-75b38bfc6554" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 555.915719] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance network_info: |[{"id": "50897877-7974-45f4-be58-52d3c88d26c1", "address": "fa:16:3e:a5:46:8d", "network": {"id": "befe768f-b79e-4bbf-8bb5-1f80b78e45db", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1138998705-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b48a41a6b634f9fa85b86451056713e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50897877-79", "ovs_interfaceid": "50897877-7974-45f4-be58-52d3c88d26c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 555.916861] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a5:46:8d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92f3cfd6-c130-4390-8910-865fbc42afd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '50897877-7974-45f4-be58-52d3c88d26c1', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 555.924641] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Creating folder: Project (2b48a41a6b634f9fa85b86451056713e). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.924641] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ecb5fc0f-eac7-46d7-b8f4-513b09f55b5c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.936569] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Created folder: Project (2b48a41a6b634f9fa85b86451056713e) in parent group-v572532. [ 555.936776] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Creating folder: Instances. Parent ref: group-v572542. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.937112] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c79badd9-61b1-49c4-8430-4a51a611ae59 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.946272] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Created folder: Instances in parent group-v572542. [ 555.946513] env[65680]: DEBUG oslo.service.loopingcall [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 555.946699] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 555.946900] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7c142108-c5e8-41f3-83a5-9ba85bba434f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.968906] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 555.968906] env[65680]: value = "task-2847833" [ 555.968906] env[65680]: _type = "Task" [ 555.968906] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 555.977601] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847833, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 556.301709] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 556.301905] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 556.309812] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 556.309812] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 556.310129] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 556.318407] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] 
Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 556.339399] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847830, 'name': CreateVM_Task, 'duration_secs': 0.812829} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 556.339615] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 556.340274] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 556.340429] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 556.340735] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 556.341263] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da09f57c-2b00-47d6-9c39-80c206ca90ec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.346254] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Waiting for the task: (returnval){ [ 556.346254] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52193a1f-3aca-51d0-64ee-8cb2f5122bbc" [ 556.346254] env[65680]: _type = "Task" [ 556.346254] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 556.355505] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52193a1f-3aca-51d0-64ee-8cb2f5122bbc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 556.378069] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 556.378069] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 556.380136] env[65680]: INFO nova.compute.claims [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 556.484748] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847833, 'name': CreateVM_Task, 'duration_secs': 0.325012} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 556.484925] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 556.485589] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 556.543069] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-677e0aac-7d87-4457-89e7-fef25594ca5f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.551115] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd4907c7-985a-4fe9-b435-311bcaf4ae1f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.588914] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90efeab2-9aae-4d2a-8e86-4f8dc60e0aa6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.597878] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42bcd8ea-780d-4c20-a7a2-a93cbb3e219f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.612921] env[65680]: DEBUG nova.compute.provider_tree [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 556.622121] env[65680]: DEBUG nova.scheduler.client.report [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 556.638479] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 556.639200] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 556.680579] env[65680]: DEBUG nova.compute.utils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 556.681969] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 556.682590] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 556.703682] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 556.785919] env[65680]: DEBUG nova.policy [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c16303644f3420eb6041e8b76b7a823', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9a6709b4017c4ee086e003690484af4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 556.796843] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 556.832779] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 556.832966] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 556.833139] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 556.833428] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 556.833632] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Image pref 0:0:0 {{(pid=65680) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 556.833783] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 556.834227] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 556.834227] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 556.834358] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 556.834467] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 556.834636] env[65680]: DEBUG nova.virt.hardware [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 556.836419] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87ee574b-33ca-476d-a2af-6ad31d86791b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.844852] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ca2d3f2-0dd0-473d-8c04-b31f62e950b1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.866670] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 556.870260] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 556.870260] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 556.870260] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 556.870260] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 556.870395] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-62f2e9cf-5918-473a-85d3-59a1825eee2d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.873867] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Waiting for the task: (returnval){ [ 556.873867] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52a120c2-98e7-760b-35e9-f79611cb36ec" [ 556.873867] env[65680]: _type = "Task" [ 556.873867] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 556.885174] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52a120c2-98e7-760b-35e9-f79611cb36ec, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 556.908586] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 556.908816] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 556.924360] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 556.940100] env[65680]: DEBUG nova.compute.manager [req-63fd11bd-ffd7-43c8-adda-37d9899514fa req-38fc068a-7198-4ad8-a6ac-b6ef2eec49d6 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Received event network-vif-plugged-0845dcf6-d5f1-44e3-ad80-12eeb1087407 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 556.940100] env[65680]: DEBUG oslo_concurrency.lockutils [req-63fd11bd-ffd7-43c8-adda-37d9899514fa req-38fc068a-7198-4ad8-a6ac-b6ef2eec49d6 service nova] Acquiring lock "a8b4f796-2893-4c05-be82-16a1bfd46db9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 556.940309] env[65680]: DEBUG oslo_concurrency.lockutils [req-63fd11bd-ffd7-43c8-adda-37d9899514fa req-38fc068a-7198-4ad8-a6ac-b6ef2eec49d6 service nova] Lock "a8b4f796-2893-4c05-be82-16a1bfd46db9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 556.940474] env[65680]: DEBUG oslo_concurrency.lockutils [req-63fd11bd-ffd7-43c8-adda-37d9899514fa req-38fc068a-7198-4ad8-a6ac-b6ef2eec49d6 service nova] Lock "a8b4f796-2893-4c05-be82-16a1bfd46db9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 556.940635] env[65680]: DEBUG nova.compute.manager [req-63fd11bd-ffd7-43c8-adda-37d9899514fa req-38fc068a-7198-4ad8-a6ac-b6ef2eec49d6 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] No waiting events found dispatching network-vif-plugged-0845dcf6-d5f1-44e3-ad80-12eeb1087407 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 556.940796] env[65680]: WARNING nova.compute.manager [req-63fd11bd-ffd7-43c8-adda-37d9899514fa req-38fc068a-7198-4ad8-a6ac-b6ef2eec49d6 service nova] [instance: 
a8b4f796-2893-4c05-be82-16a1bfd46db9] Received unexpected event network-vif-plugged-0845dcf6-d5f1-44e3-ad80-12eeb1087407 for instance with vm_state building and task_state spawning. [ 556.981899] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 556.982294] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 556.983708] env[65680]: INFO nova.compute.claims [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 557.157795] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94128c05-785f-43cc-b2fd-e7f360194de5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.165909] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b9e4ec9-e805-422d-b26e-733509480d95 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.200165] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cf016d8-d601-4616-8648-ac5784aab46c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.208335] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6ea4650-9452-4a6a-a319-e755d2d5ee75 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.224175] env[65680]: DEBUG nova.compute.provider_tree [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 557.233502] env[65680]: DEBUG nova.scheduler.client.report [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 557.251732] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Successfully created port: c6374e08-c390-4e13-9080-80a7c5dd3dc9 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 557.255411] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 557.255873] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 557.293360] env[65680]: DEBUG nova.compute.utils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 557.295078] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 557.295293] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 557.310990] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 557.390555] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 557.391039] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 557.392178] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 557.404878] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 557.423144] env[65680]: DEBUG nova.policy [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e930ca6330e4f1f81e28e8a9b336af2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f8403f4435a4c8c872696e896f983ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 557.435371] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False 
{{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 557.435602] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 557.435757] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 557.435934] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 557.436090] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 557.436367] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 557.436440] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 557.436587] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 557.436745] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 557.436903] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 557.440444] env[65680]: DEBUG nova.virt.hardware [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 
tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 557.441138] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57ab6b64-aecd-4304-9dc7-8ed9d7aa5f48 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.449565] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f8d6672-219a-4751-a27f-a8d62e9ae881 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.061662] env[65680]: DEBUG nova.compute.manager [req-097541d7-6086-4da0-a921-f7431a56ecaf req-22a4e7cd-2ec3-4d50-b68e-f09fb6210a78 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Received event network-vif-plugged-332a8d1b-b53c-4a60-a737-eaa3a81d494b {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 558.061948] env[65680]: DEBUG oslo_concurrency.lockutils [req-097541d7-6086-4da0-a921-f7431a56ecaf req-22a4e7cd-2ec3-4d50-b68e-f09fb6210a78 service nova] Acquiring lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.061948] env[65680]: DEBUG oslo_concurrency.lockutils [req-097541d7-6086-4da0-a921-f7431a56ecaf req-22a4e7cd-2ec3-4d50-b68e-f09fb6210a78 service nova] Lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 558.062077] env[65680]: DEBUG oslo_concurrency.lockutils [req-097541d7-6086-4da0-a921-f7431a56ecaf req-22a4e7cd-2ec3-4d50-b68e-f09fb6210a78 service nova] Lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 558.062246] env[65680]: DEBUG nova.compute.manager [req-097541d7-6086-4da0-a921-f7431a56ecaf req-22a4e7cd-2ec3-4d50-b68e-f09fb6210a78 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] No waiting events found dispatching network-vif-plugged-332a8d1b-b53c-4a60-a737-eaa3a81d494b {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 558.062405] env[65680]: WARNING nova.compute.manager [req-097541d7-6086-4da0-a921-f7431a56ecaf req-22a4e7cd-2ec3-4d50-b68e-f09fb6210a78 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Received unexpected event network-vif-plugged-332a8d1b-b53c-4a60-a737-eaa3a81d494b for instance with vm_state building and task_state spawning. 
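(Annotation: the nova.virt.hardware lines above walk through CPU-topology selection for the 1-vCPU m1.nano flavor — limits and preferences logged as 0:0:0 mean "unset", and the only surviving candidate is sockets=1, cores=1, threads=1. Below is a simplified, hypothetical sketch of that enumeration step; the function and type names are illustrative only and are not Nova's actual API.)

# Hypothetical, simplified illustration of the topology enumeration the
# nova.virt.hardware log lines above trace for a 1-vCPU flavor.
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield every (sockets, cores, threads) split whose product equals vcpus."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    yield VirtCPUTopology(s, c, t)

# With vcpus=1 and no flavor/image limits (logged as 0:0:0, i.e. unset),
# the only candidate is sockets=1, cores=1, threads=1 -- matching the
# "Got 1 possible topologies" / "Sorted desired topologies" lines above.
print(list(possible_topologies(1)))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]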
[ 558.424300] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_power_states {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 558.453568] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Getting list of instances from cluster (obj){ [ 558.453568] env[65680]: value = "domain-c8" [ 558.453568] env[65680]: _type = "ClusterComputeResource" [ 558.453568] env[65680]: } {{(pid=65680) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 558.456501] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb536287-1304-4f4c-9958-ea9982219ce2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.472027] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Successfully created port: 72f5bc74-d38e-4fb4-a820-74f411e1c78e {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 558.474747] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Got total of 4 instances {{(pid=65680) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 558.474963] env[65680]: WARNING nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] While synchronizing instance power states, found 6 instances in the database and 4 instances on the hypervisor. 
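(Annotation: the _sync_power_states warning above is the classic DB-versus-hypervisor reconciliation: the database knows 6 instances but the cluster only reports 4, because two builds have not yet produced a VM, so the task logs the mismatch and then triggers a per-UUID sync. The sketch below is illustrative pseudologic under that assumption, not Nova's actual implementation; sync_one stands in for the real per-instance handler.)

# Minimal, hypothetical sketch of the reconciliation pattern behind the
# "found 6 instances in the database and 4 instances on the hypervisor" warning.
import logging

LOG = logging.getLogger(__name__)

def sync_power_states(db_instance_uuids, hypervisor_uuids, sync_one):
    """db_instance_uuids: UUIDs from the DB; hypervisor_uuids: set of UUIDs
    reported by the virt driver; sync_one: callable invoked per UUID."""
    db_uuids = list(db_instance_uuids)
    if len(db_uuids) != len(hypervisor_uuids):
        LOG.warning(
            "While synchronizing instance power states, found %d instances "
            "in the database and %d instances on the hypervisor.",
            len(db_uuids), len(hypervisor_uuids))
    for uuid in db_uuids:
        # The real service takes a per-instance lock before syncing; here we
        # simply dispatch each UUID to the supplied handler.
        sync_one(uuid)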
[ 558.475103] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid eecb0c81-3810-4edd-b2da-032be990dcdc {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 558.475292] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid a8b4f796-2893-4c05-be82-16a1bfd46db9 {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 558.475416] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid 4bea49fd-7709-4aa8-86ac-c08ee943dd73 {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 558.475565] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid d98c190b-7d45-4e74-909d-75b38bfc6554 {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 558.475712] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 558.475856] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid 059f5688-3497-40bd-bf18-9c0748f3bdd6 {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 558.476196] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "eecb0c81-3810-4edd-b2da-032be990dcdc" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.476384] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "a8b4f796-2893-4c05-be82-16a1bfd46db9" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.476586] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.476781] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "d98c190b-7d45-4e74-909d-75b38bfc6554" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.476966] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.477484] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 558.477680] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 558.478021] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Getting list of instances from cluster (obj){ [ 558.478021] env[65680]: value = "domain-c8" [ 558.478021] env[65680]: _type = "ClusterComputeResource" [ 558.478021] env[65680]: } {{(pid=65680) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 558.479401] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55895e48-3203-46c0-b5e7-b543eb577b8b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.493200] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Got total of 4 instances {{(pid=65680) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 559.849182] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Successfully updated port: c6374e08-c390-4e13-9080-80a7c5dd3dc9 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 559.859733] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "refresh_cache-53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 559.859886] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquired lock "refresh_cache-53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 559.860073] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 559.976278] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 561.042911] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Updating instance_info_cache with network_info: [{"id": "c6374e08-c390-4e13-9080-80a7c5dd3dc9", "address": "fa:16:3e:fe:3f:25", "network": {"id": "9c4fcc45-f838-4e30-8925-973d1c2f9a4b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1388453852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9a6709b4017c4ee086e003690484af4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5c60ce-845e-4506-bc10-348461fece6d", "external-id": "nsx-vlan-transportzone-831", "segmentation_id": 831, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc6374e08-c3", "ovs_interfaceid": "c6374e08-c390-4e13-9080-80a7c5dd3dc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 561.055785] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Releasing lock "refresh_cache-53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 561.056119] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance network_info: |[{"id": "c6374e08-c390-4e13-9080-80a7c5dd3dc9", "address": "fa:16:3e:fe:3f:25", "network": {"id": "9c4fcc45-f838-4e30-8925-973d1c2f9a4b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1388453852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9a6709b4017c4ee086e003690484af4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5c60ce-845e-4506-bc10-348461fece6d", "external-id": "nsx-vlan-transportzone-831", "segmentation_id": 831, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc6374e08-c3", "ovs_interfaceid": "c6374e08-c390-4e13-9080-80a7c5dd3dc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 561.056518] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fe:3f:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b5c60ce-845e-4506-bc10-348461fece6d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c6374e08-c390-4e13-9080-80a7c5dd3dc9', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 561.069634] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Creating folder: Project (9a6709b4017c4ee086e003690484af4e). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.070366] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-603c8ae4-ee0f-4c16-bdb5-519274d0be3a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.084046] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Created folder: Project (9a6709b4017c4ee086e003690484af4e) in parent group-v572532. [ 561.084046] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Creating folder: Instances. Parent ref: group-v572545. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.084046] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37f6ada6-e138-4e2b-a862-e936ca9aa9da {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.098119] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Created folder: Instances in parent group-v572545. [ 561.098375] env[65680]: DEBUG oslo.service.loopingcall [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 561.098806] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 561.098806] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e5ec352b-bc57-4fb9-9f06-1f50ce45bd0d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.122956] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 561.122956] env[65680]: value = "task-2847836" [ 561.122956] env[65680]: _type = "Task" [ 561.122956] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.134999] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847836, 'name': CreateVM_Task} progress is 5%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.236061] env[65680]: DEBUG nova.compute.manager [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Received event network-changed-0845dcf6-d5f1-44e3-ad80-12eeb1087407 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 561.236061] env[65680]: DEBUG nova.compute.manager [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Refreshing instance network info cache due to event network-changed-0845dcf6-d5f1-44e3-ad80-12eeb1087407. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 561.236061] env[65680]: DEBUG oslo_concurrency.lockutils [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] Acquiring lock "refresh_cache-a8b4f796-2893-4c05-be82-16a1bfd46db9" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.236061] env[65680]: DEBUG oslo_concurrency.lockutils [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] Acquired lock "refresh_cache-a8b4f796-2893-4c05-be82-16a1bfd46db9" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.236061] env[65680]: DEBUG nova.network.neutron [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Refreshing network info cache for port 0845dcf6-d5f1-44e3-ad80-12eeb1087407 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 561.637624] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847836, 'name': CreateVM_Task, 'duration_secs': 0.334377} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 561.637800] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 561.638500] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.638648] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.638958] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 561.639244] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c1dc541-9e78-45a7-b0bd-1fd1044d6d43 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.645464] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Waiting for the task: (returnval){ [ 561.645464] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52fd7f48-bfb9-363c-fb1f-e43f3dcf76cd" [ 561.645464] env[65680]: _type = "Task" [ 561.645464] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.656077] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52fd7f48-bfb9-363c-fb1f-e43f3dcf76cd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.670778] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Received event network-changed-332a8d1b-b53c-4a60-a737-eaa3a81d494b {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 561.670962] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Refreshing instance network info cache due to event network-changed-332a8d1b-b53c-4a60-a737-eaa3a81d494b. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 561.672337] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquiring lock "refresh_cache-4bea49fd-7709-4aa8-86ac-c08ee943dd73" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.672491] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquired lock "refresh_cache-4bea49fd-7709-4aa8-86ac-c08ee943dd73" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.672654] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Refreshing network info cache for port 332a8d1b-b53c-4a60-a737-eaa3a81d494b {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 562.128166] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Successfully updated port: 72f5bc74-d38e-4fb4-a820-74f411e1c78e {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 562.141558] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "refresh_cache-059f5688-3497-40bd-bf18-9c0748f3bdd6" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.141731] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquired lock "refresh_cache-059f5688-3497-40bd-bf18-9c0748f3bdd6" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 562.142366] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 562.158671] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 562.159276] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 562.159502] env[65680]: 
DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.325699] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.325699] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.325972] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 562.325972] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 562.332419] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 562.357466] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.357607] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.357729] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.357855] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.357977] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.358113] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.358234] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 562.359894] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.360028] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.360507] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.360507] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.360709] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.360821] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.360920] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 562.361330] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.374981] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.375314] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.375539] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.375702] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 562.377985] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-944d6b8c-f33f-43b2-b984-dc3e3ebc646a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.391095] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b6a546-760f-4a3a-8a87-0bbe64feb3a6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.406547] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-263e5dd3-9c17-4c38-9085-de957549a62a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.418133] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba186040-e85b-4c27-b718-36d900ad58ce {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.451846] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181092MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 562.452076] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.452864] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.578863] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance eecb0c81-3810-4edd-b2da-032be990dcdc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 562.579084] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance a8b4f796-2893-4c05-be82-16a1bfd46db9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 562.579247] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 4bea49fd-7709-4aa8-86ac-c08ee943dd73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 562.579375] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d98c190b-7d45-4e74-909d-75b38bfc6554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 562.579497] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 562.579616] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 562.579805] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 562.580190] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 562.677526] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0271e21-4082-43ed-ad7e-392a099ccbd3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.685649] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6b3bb55-c0b5-461c-be1b-79e01b26dbee {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.719274] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff4a03e2-4aec-411a-a214-0b9a07ce26a3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.727502] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08c6bfd0-6d5b-4df2-adee-d1a6c428405b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.741784] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 562.752947] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 562.769585] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 562.769762] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.143939] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 
req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Updated VIF entry in instance network info cache for port 332a8d1b-b53c-4a60-a737-eaa3a81d494b. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 563.143939] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Updating instance_info_cache with network_info: [{"id": "332a8d1b-b53c-4a60-a737-eaa3a81d494b", "address": "fa:16:3e:a2:8c:99", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.156", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap332a8d1b-b5", "ovs_interfaceid": "332a8d1b-b53c-4a60-a737-eaa3a81d494b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.153897] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Releasing lock "refresh_cache-4bea49fd-7709-4aa8-86ac-c08ee943dd73" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.154165] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Received event network-vif-plugged-50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 563.154358] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquiring lock "d98c190b-7d45-4e74-909d-75b38bfc6554-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.154556] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Lock "d98c190b-7d45-4e74-909d-75b38bfc6554-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.154810] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Lock "d98c190b-7d45-4e74-909d-75b38bfc6554-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.154916] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] No waiting events found dispatching network-vif-plugged-50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 563.155117] env[65680]: WARNING nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Received unexpected event network-vif-plugged-50897877-7974-45f4-be58-52d3c88d26c1 for instance with vm_state building and task_state spawning. [ 563.155408] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Received event network-changed-50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 563.155586] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Refreshing instance network info cache due to event network-changed-50897877-7974-45f4-be58-52d3c88d26c1. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 563.155772] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquiring lock "refresh_cache-d98c190b-7d45-4e74-909d-75b38bfc6554" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 563.156817] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquired lock "refresh_cache-d98c190b-7d45-4e74-909d-75b38bfc6554" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 563.157041] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Refreshing network info cache for port 50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 563.359633] env[65680]: DEBUG nova.network.neutron [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Updated VIF entry in instance network info cache for port 0845dcf6-d5f1-44e3-ad80-12eeb1087407. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 563.359996] env[65680]: DEBUG nova.network.neutron [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Updating instance_info_cache with network_info: [{"id": "0845dcf6-d5f1-44e3-ad80-12eeb1087407", "address": "fa:16:3e:00:35:71", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.128", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0845dcf6-d5", "ovs_interfaceid": "0845dcf6-d5f1-44e3-ad80-12eeb1087407", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.374147] env[65680]: DEBUG oslo_concurrency.lockutils [req-e29a32ad-abe3-4467-aa4d-3c79e927ec18 req-18e9d190-d5b2-4c3d-b2a9-7e98e7793c99 service nova] Releasing lock "refresh_cache-a8b4f796-2893-4c05-be82-16a1bfd46db9" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.623122] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Updating instance_info_cache with network_info: [{"id": "72f5bc74-d38e-4fb4-a820-74f411e1c78e", "address": "fa:16:3e:06:b2:25", "network": {"id": "c0036600-1c1b-41df-889b-5d47eba90eb7", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1651873931-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0f8403f4435a4c8c872696e896f983ae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f78b07ea-f425-4622-84f4-706a5d8820a7", "external-id": "nsx-vlan-transportzone-126", "segmentation_id": 126, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72f5bc74-d3", "ovs_interfaceid": "72f5bc74-d38e-4fb4-a820-74f411e1c78e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.634236] env[65680]: DEBUG 
oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Releasing lock "refresh_cache-059f5688-3497-40bd-bf18-9c0748f3bdd6" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.634562] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance network_info: |[{"id": "72f5bc74-d38e-4fb4-a820-74f411e1c78e", "address": "fa:16:3e:06:b2:25", "network": {"id": "c0036600-1c1b-41df-889b-5d47eba90eb7", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1651873931-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0f8403f4435a4c8c872696e896f983ae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f78b07ea-f425-4622-84f4-706a5d8820a7", "external-id": "nsx-vlan-transportzone-126", "segmentation_id": 126, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72f5bc74-d3", "ovs_interfaceid": "72f5bc74-d38e-4fb4-a820-74f411e1c78e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 563.634943] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:06:b2:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f78b07ea-f425-4622-84f4-706a5d8820a7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '72f5bc74-d38e-4fb4-a820-74f411e1c78e', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 563.646277] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Creating folder: Project (0f8403f4435a4c8c872696e896f983ae). Parent ref: group-v572532. 
{{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 563.649905] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a147d72b-d023-47bd-ae6c-05cba59ba9ab {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.664364] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Created folder: Project (0f8403f4435a4c8c872696e896f983ae) in parent group-v572532. [ 563.664499] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Creating folder: Instances. Parent ref: group-v572548. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 563.664763] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2ebebee9-0fe9-4309-a0ac-85d90e2f95c0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.677072] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Created folder: Instances in parent group-v572548. [ 563.677339] env[65680]: DEBUG oslo.service.loopingcall [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 563.677480] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 563.678206] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-21236c04-db2b-44c7-9c33-a5d519e3c94d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.701667] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 563.701667] env[65680]: value = "task-2847839" [ 563.701667] env[65680]: _type = "Task" [ 563.701667] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 563.714091] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847839, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 563.728159] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "fc14c935-fe84-4a49-ac1e-575e56b672a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.728485] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "fc14c935-fe84-4a49-ac1e-575e56b672a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.740161] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 563.810057] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.810362] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.811886] env[65680]: INFO nova.compute.claims [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 563.875179] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Updated VIF entry in instance network info cache for port 50897877-7974-45f4-be58-52d3c88d26c1. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 563.876099] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Updating instance_info_cache with network_info: [{"id": "50897877-7974-45f4-be58-52d3c88d26c1", "address": "fa:16:3e:a5:46:8d", "network": {"id": "befe768f-b79e-4bbf-8bb5-1f80b78e45db", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1138998705-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2b48a41a6b634f9fa85b86451056713e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92f3cfd6-c130-4390-8910-865fbc42afd1", "external-id": "nsx-vlan-transportzone-142", "segmentation_id": 142, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50897877-79", "ovs_interfaceid": "50897877-7974-45f4-be58-52d3c88d26c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.893915] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Releasing lock "refresh_cache-d98c190b-7d45-4e74-909d-75b38bfc6554" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.894241] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Received event network-vif-plugged-c6374e08-c390-4e13-9080-80a7c5dd3dc9 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 563.894457] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquiring lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.894668] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.894822] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.894992] env[65680]: DEBUG nova.compute.manager 
[req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] No waiting events found dispatching network-vif-plugged-c6374e08-c390-4e13-9080-80a7c5dd3dc9 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 563.895194] env[65680]: WARNING nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Received unexpected event network-vif-plugged-c6374e08-c390-4e13-9080-80a7c5dd3dc9 for instance with vm_state building and task_state spawning. [ 563.895348] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Received event network-changed-c6374e08-c390-4e13-9080-80a7c5dd3dc9 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 563.895492] env[65680]: DEBUG nova.compute.manager [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Refreshing instance network info cache due to event network-changed-c6374e08-c390-4e13-9080-80a7c5dd3dc9. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 563.895685] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquiring lock "refresh_cache-53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 563.895824] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Acquired lock "refresh_cache-53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 563.895985] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Refreshing network info cache for port c6374e08-c390-4e13-9080-80a7c5dd3dc9 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 564.006179] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-243d04b1-728f-4b88-a3b3-2c3dbcf4240f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.013416] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d9e6be0-65bb-4bde-9a48-dbf2d59ba8ca {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.045113] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d3d253-846a-446b-a30c-6fa37c90426f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.053394] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da02e2f-4dbf-48ef-bc8d-7a54150b1423 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.065907] env[65680]: DEBUG nova.compute.provider_tree [None 
req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 564.074260] env[65680]: DEBUG nova.scheduler.client.report [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 564.092039] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.092039] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 564.124140] env[65680]: DEBUG nova.compute.utils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 564.126737] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Not allocating networking since 'none' was specified. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 564.137913] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 564.224279] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847839, 'name': CreateVM_Task, 'duration_secs': 0.30749} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 564.224528] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 564.226407] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 564.226407] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 564.226407] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 564.226407] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-47da3422-bb39-4426-bfad-c6fdc979b713 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.231268] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Waiting for the task: (returnval){ [ 564.231268] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]525120c5-2e1a-f629-f1e2-9640f3b1aa71" [ 564.231268] env[65680]: _type = "Task" [ 564.231268] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.240166] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]525120c5-2e1a-f629-f1e2-9640f3b1aa71, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.253670] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 564.283731] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 564.284279] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 564.284279] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 564.284279] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 564.284422] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 564.284583] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 564.284786] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 564.284931] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 564.285099] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 
tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 564.285252] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 564.285411] env[65680]: DEBUG nova.virt.hardware [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 564.286277] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76187e07-78c5-4f1b-8ef3-bf2d482300a3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.294767] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-435618cc-029b-4fa8-bb27-1c3a62d5fed5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.316644] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance VIF info [] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 564.323312] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Creating folder: Project (edb05b49329b402b9e7db6367c95525d). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.324119] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6a9b6323-bd3f-4ac9-89fe-9859a4cc61c4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.337416] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Created folder: Project (edb05b49329b402b9e7db6367c95525d) in parent group-v572532. [ 564.337416] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Creating folder: Instances. Parent ref: group-v572551. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.337416] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a228429a-632b-46a4-be6b-7c8b2bcd2a74 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.347157] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Created folder: Instances in parent group-v572551. 
[ 564.347552] env[65680]: DEBUG oslo.service.loopingcall [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 564.347840] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 564.348179] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-66ddd0f1-917f-4dfd-876b-aee20e1799b9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.367042] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 564.367042] env[65680]: value = "task-2847842" [ 564.367042] env[65680]: _type = "Task" [ 564.367042] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.375906] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847842, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.745812] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 564.747035] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 564.750125] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 564.877492] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847842, 'name': CreateVM_Task, 'duration_secs': 0.234717} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 564.877785] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 564.880575] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 564.880575] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 564.880575] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 564.880575] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e9a0f2eb-8167-4bee-a74e-41c41af3b658 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.884779] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for the task: (returnval){ [ 564.884779] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]528c2fb8-4491-dc86-e054-bc34fac34d97" [ 564.884779] env[65680]: _type = "Task" [ 564.884779] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.899863] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]528c2fb8-4491-dc86-e054-bc34fac34d97, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.901679] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Updated VIF entry in instance network info cache for port c6374e08-c390-4e13-9080-80a7c5dd3dc9. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 564.902018] env[65680]: DEBUG nova.network.neutron [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Updating instance_info_cache with network_info: [{"id": "c6374e08-c390-4e13-9080-80a7c5dd3dc9", "address": "fa:16:3e:fe:3f:25", "network": {"id": "9c4fcc45-f838-4e30-8925-973d1c2f9a4b", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1388453852-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9a6709b4017c4ee086e003690484af4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b5c60ce-845e-4506-bc10-348461fece6d", "external-id": "nsx-vlan-transportzone-831", "segmentation_id": 831, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc6374e08-c3", "ovs_interfaceid": "c6374e08-c390-4e13-9080-80a7c5dd3dc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 564.916218] env[65680]: DEBUG oslo_concurrency.lockutils [req-a7b2b5e2-f8f6-47f7-a39c-7fec00caafa7 req-dc62a94c-32de-46ed-8b1c-3904dd202030 service nova] Releasing lock "refresh_cache-53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 565.395913] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 565.396201] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 565.396381] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 566.143262] env[65680]: DEBUG nova.compute.manager [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Received event network-vif-plugged-72f5bc74-d38e-4fb4-a820-74f411e1c78e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 566.143475] env[65680]: DEBUG 
oslo_concurrency.lockutils [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] Acquiring lock "059f5688-3497-40bd-bf18-9c0748f3bdd6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.143736] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.143833] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.143993] env[65680]: DEBUG nova.compute.manager [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] No waiting events found dispatching network-vif-plugged-72f5bc74-d38e-4fb4-a820-74f411e1c78e {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 566.145318] env[65680]: WARNING nova.compute.manager [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Received unexpected event network-vif-plugged-72f5bc74-d38e-4fb4-a820-74f411e1c78e for instance with vm_state building and task_state spawning. [ 566.145507] env[65680]: DEBUG nova.compute.manager [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Received event network-changed-72f5bc74-d38e-4fb4-a820-74f411e1c78e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 566.145667] env[65680]: DEBUG nova.compute.manager [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Refreshing instance network info cache due to event network-changed-72f5bc74-d38e-4fb4-a820-74f411e1c78e. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 566.145857] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] Acquiring lock "refresh_cache-059f5688-3497-40bd-bf18-9c0748f3bdd6" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 566.145993] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] Acquired lock "refresh_cache-059f5688-3497-40bd-bf18-9c0748f3bdd6" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 566.146211] env[65680]: DEBUG nova.network.neutron [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Refreshing network info cache for port 72f5bc74-d38e-4fb4-a820-74f411e1c78e {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 567.155745] env[65680]: DEBUG nova.network.neutron [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Updated VIF entry in instance network info cache for port 72f5bc74-d38e-4fb4-a820-74f411e1c78e. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 567.156191] env[65680]: DEBUG nova.network.neutron [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Updating instance_info_cache with network_info: [{"id": "72f5bc74-d38e-4fb4-a820-74f411e1c78e", "address": "fa:16:3e:06:b2:25", "network": {"id": "c0036600-1c1b-41df-889b-5d47eba90eb7", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1651873931-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0f8403f4435a4c8c872696e896f983ae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f78b07ea-f425-4622-84f4-706a5d8820a7", "external-id": "nsx-vlan-transportzone-126", "segmentation_id": 126, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72f5bc74-d3", "ovs_interfaceid": "72f5bc74-d38e-4fb4-a820-74f411e1c78e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 567.166827] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2210a0f-c7a0-4f5b-b2be-a06ceea65457 req-7497ca3b-8ec0-4677-81d6-3f7247130dbd service nova] Releasing lock "refresh_cache-059f5688-3497-40bd-bf18-9c0748f3bdd6" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 569.175075] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock 
"acbe2170-7ce3-4820-b082-6680e559bde1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.175477] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "acbe2170-7ce3-4820-b082-6680e559bde1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.198171] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 569.261024] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.261024] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.262054] env[65680]: INFO nova.compute.claims [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 569.474215] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33579547-60a7-4d95-b53c-a5a0337c8f9f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.484879] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca8046ca-bf43-4322-afec-e46948a241eb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.520866] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7642665-1921-41d1-907b-c80ddf93dbf2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.529364] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc66d47d-3b4a-4ee8-a4b7-2fa26ff0c942 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.545081] env[65680]: DEBUG nova.compute.provider_tree [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Inventory has not 
changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 569.557079] env[65680]: DEBUG nova.scheduler.client.report [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 569.578323] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.318s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 569.579088] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 569.620744] env[65680]: DEBUG nova.compute.utils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 569.621279] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 569.621514] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 569.632265] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 569.704782] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 569.728390] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 569.728911] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 569.729160] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 569.729368] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 569.729519] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 569.729669] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 569.729884] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 569.730081] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 569.730228] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 569.730393] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 569.730710] env[65680]: DEBUG nova.virt.hardware [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 569.731461] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a00d6052-2e6a-4d36-9f2c-a776995bad65 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.742637] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d24cf2a7-55f1-498e-b82c-3dbf93090dea {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.759570] env[65680]: DEBUG nova.policy [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a1aefdc5fcc4101b93e7691f1a171d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2595bab122a24598824d60d33811ad74', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 570.617208] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Successfully created port: b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 571.939829] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Successfully updated port: b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 571.960802] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "refresh_cache-acbe2170-7ce3-4820-b082-6680e559bde1" {{(pid=65680) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 571.960957] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquired lock "refresh_cache-acbe2170-7ce3-4820-b082-6680e559bde1" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 571.961129] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 572.016682] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 572.283294] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Updating instance_info_cache with network_info: [{"id": "b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7", "address": "fa:16:3e:5c:cc:e8", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e3cc85-fa", "ovs_interfaceid": "b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 572.299678] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Releasing lock "refresh_cache-acbe2170-7ce3-4820-b082-6680e559bde1" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 572.299985] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance network_info: |[{"id": "b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7", "address": "fa:16:3e:5c:cc:e8", "network": {"id": 
"a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e3cc85-fa", "ovs_interfaceid": "b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 572.300625] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5c:cc:e8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c02dd284-ab80-451c-93eb-48c8360acb9c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 572.307901] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Creating folder: Project (2595bab122a24598824d60d33811ad74). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.308542] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-359d2abc-bc78-41ad-b7ab-3fb012c76f43 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.319245] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Created folder: Project (2595bab122a24598824d60d33811ad74) in parent group-v572532. [ 572.319245] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Creating folder: Instances. Parent ref: group-v572557. 
{{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.319245] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b338b313-c03e-420d-8b5e-bca0d1a37886 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.328760] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Created folder: Instances in parent group-v572557. [ 572.328995] env[65680]: DEBUG oslo.service.loopingcall [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 572.329209] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 572.329405] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-324b3bbc-e7fb-484e-a64f-9910552fef61 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.348427] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 572.348427] env[65680]: value = "task-2847850" [ 572.348427] env[65680]: _type = "Task" [ 572.348427] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 572.358020] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847850, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 572.858532] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847850, 'name': CreateVM_Task, 'duration_secs': 0.375126} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 572.858703] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 572.859414] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 572.859586] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 572.860055] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 572.860255] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e19cb42c-322b-4d62-a227-a4ee7541265a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.864785] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Waiting for the task: (returnval){ [ 572.864785] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]524fd8c3-4b32-957c-8e71-a393c4fad29f" [ 572.864785] env[65680]: _type = "Task" [ 572.864785] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 572.873488] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]524fd8c3-4b32-957c-8e71-a393c4fad29f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 573.047281] env[65680]: DEBUG nova.compute.manager [req-7f760a57-ce30-4cb0-9dc6-73faae501284 req-95c023eb-c43c-4f73-96cc-e01d819a13f5 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Received event network-vif-plugged-b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 573.047525] env[65680]: DEBUG oslo_concurrency.lockutils [req-7f760a57-ce30-4cb0-9dc6-73faae501284 req-95c023eb-c43c-4f73-96cc-e01d819a13f5 service nova] Acquiring lock "acbe2170-7ce3-4820-b082-6680e559bde1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 573.047698] env[65680]: DEBUG oslo_concurrency.lockutils [req-7f760a57-ce30-4cb0-9dc6-73faae501284 req-95c023eb-c43c-4f73-96cc-e01d819a13f5 service nova] Lock "acbe2170-7ce3-4820-b082-6680e559bde1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 573.047858] env[65680]: DEBUG oslo_concurrency.lockutils [req-7f760a57-ce30-4cb0-9dc6-73faae501284 req-95c023eb-c43c-4f73-96cc-e01d819a13f5 service nova] Lock "acbe2170-7ce3-4820-b082-6680e559bde1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 573.052140] env[65680]: DEBUG nova.compute.manager [req-7f760a57-ce30-4cb0-9dc6-73faae501284 req-95c023eb-c43c-4f73-96cc-e01d819a13f5 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] No waiting events found dispatching network-vif-plugged-b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 573.052140] env[65680]: WARNING nova.compute.manager [req-7f760a57-ce30-4cb0-9dc6-73faae501284 req-95c023eb-c43c-4f73-96cc-e01d819a13f5 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Received unexpected event network-vif-plugged-b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 for instance with vm_state building and task_state spawning. 
[ 573.375120] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 573.375443] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 573.375707] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.720892] env[65680]: DEBUG nova.compute.manager [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Received event network-changed-b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 575.720892] env[65680]: DEBUG nova.compute.manager [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Refreshing instance network info cache due to event network-changed-b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 575.720892] env[65680]: DEBUG oslo_concurrency.lockutils [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] Acquiring lock "refresh_cache-acbe2170-7ce3-4820-b082-6680e559bde1" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.720892] env[65680]: DEBUG oslo_concurrency.lockutils [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] Acquired lock "refresh_cache-acbe2170-7ce3-4820-b082-6680e559bde1" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 575.720892] env[65680]: DEBUG nova.network.neutron [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Refreshing network info cache for port b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 576.337474] env[65680]: DEBUG nova.network.neutron [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Updated VIF entry in instance network info cache for port b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 576.338417] env[65680]: DEBUG nova.network.neutron [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Updating instance_info_cache with network_info: [{"id": "b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7", "address": "fa:16:3e:5c:cc:e8", "network": {"id": "a9d1bb29-b5b6-4def-ba99-47d740f2c8b7", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "37ed348cd06e407e8b18e9a9365b037b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c02dd284-ab80-451c-93eb-48c8360acb9c", "external-id": "nsx-vlan-transportzone-818", "segmentation_id": 818, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5e3cc85-fa", "ovs_interfaceid": "b5e3cc85-fae1-4bd2-b81d-a670c6cdd3d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 576.351142] env[65680]: DEBUG oslo_concurrency.lockutils [req-6912b6c3-e5c5-4671-802f-50199c27230d req-201f8540-e773-447e-a690-8558c8887424 service nova] Releasing lock "refresh_cache-acbe2170-7ce3-4820-b082-6680e559bde1" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 596.762587] env[65680]: WARNING oslo_vmware.rw_handles [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 596.762587] env[65680]: ERROR oslo_vmware.rw_handles [ 596.763211] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 
tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 596.764143] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 596.764804] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Copying Virtual Disk [datastore1] vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/0ff2bdc5-41b5-4947-990f-42ebb4a0eab8/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 596.764804] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9e2692f0-4cf3-4958-b6e6-b347d1cadeda {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.775571] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Waiting for the task: (returnval){ [ 596.775571] env[65680]: value = "task-2847857" [ 596.775571] env[65680]: _type = "Task" [ 596.775571] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 596.786521] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Task: {'id': task-2847857, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 597.287089] env[65680]: DEBUG oslo_vmware.exceptions [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 597.287448] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.291677] env[65680]: ERROR nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 597.291677] env[65680]: Faults: ['InvalidArgument'] [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Traceback (most recent call last): [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] yield resources [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self.driver.spawn(context, instance, image_meta, [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self._fetch_image_if_missing(context, vi) [ 597.291677] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] image_cache(vi, tmp_image_ds_loc) [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] vm_util.copy_virtual_disk( [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] session._wait_for_task(vmdk_copy_task) [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] return self.wait_for_task(task_ref) [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] return evt.wait() [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] result = hub.switch() [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 597.292210] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] return self.greenlet.switch() [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self.f(*self.args, **self.kw) [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] raise exceptions.translate_fault(task_info.error) [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Faults: ['InvalidArgument'] [ 597.292611] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] [ 597.292611] env[65680]: INFO nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Terminating instance [ 597.293692] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 597.293943] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 597.294887] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock 
"refresh_cache-eecb0c81-3810-4edd-b2da-032be990dcdc" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 597.294887] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquired lock "refresh_cache-eecb0c81-3810-4edd-b2da-032be990dcdc" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 597.294887] env[65680]: DEBUG nova.network.neutron [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 597.295885] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1256b5cc-ce68-4ff0-9788-9a52cf2031a4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.309050] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 597.309250] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 597.310593] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2fbfc198-22ca-4c19-85da-5a7157906c9c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.318499] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Waiting for the task: (returnval){ [ 597.318499] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52083593-ad6d-76dc-663e-bbf183db8eea" [ 597.318499] env[65680]: _type = "Task" [ 597.318499] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 597.331030] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52083593-ad6d-76dc-663e-bbf183db8eea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 597.383852] env[65680]: DEBUG nova.network.neutron [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 597.684643] env[65680]: DEBUG nova.network.neutron [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.696873] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Releasing lock "refresh_cache-eecb0c81-3810-4edd-b2da-032be990dcdc" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.697565] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 597.697565] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 597.698815] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-709fc509-825a-46d4-af1b-b9f043833051 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.708745] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 597.709007] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-70240a29-4cf0-4abe-b5e2-1c75b25213d7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.744165] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 597.744576] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 597.744653] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Deleting the datastore file [datastore1] eecb0c81-3810-4edd-b2da-032be990dcdc {{(pid=65680) 
file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 597.744865] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3b88b923-d0e9-4576-ac43-1a2380b3f2d6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.754881] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Waiting for the task: (returnval){ [ 597.754881] env[65680]: value = "task-2847859" [ 597.754881] env[65680]: _type = "Task" [ 597.754881] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 597.765926] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Task: {'id': task-2847859, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 597.839706] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 597.839706] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Creating directory with path [datastore1] vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 597.839706] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b92cd23b-8d94-4d47-b647-3d2ea470a938 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.852342] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Created directory with path [datastore1] vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 597.852627] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Fetch image to [datastore1] vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 597.852807] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store 
datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 597.856107] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbafa345-0faa-467c-9b9d-688673be03c9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.866286] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a20a8bb-45be-469c-8419-59dd836a1736 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.876576] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f43f4b51-ba24-4b06-9c24-447f84162771 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.917000] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bc56039-c7da-44c6-8423-cbf18a99a2b4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.929629] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-821e18b3-14bd-4d12-90c2-5ef73586535c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.957417] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 598.014046] env[65680]: DEBUG oslo_vmware.rw_handles [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 598.084721] env[65680]: DEBUG oslo_vmware.rw_handles [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 598.084721] env[65680]: DEBUG oslo_vmware.rw_handles [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 598.267494] env[65680]: DEBUG oslo_vmware.api [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Task: {'id': task-2847859, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044263} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 598.267494] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 598.268019] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 598.268019] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 598.269454] env[65680]: INFO nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Took 0.57 seconds to destroy the instance on the hypervisor. [ 598.269454] env[65680]: DEBUG oslo.service.loopingcall [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 598.269454] env[65680]: DEBUG nova.compute.manager [-] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Skipping network deallocation for instance since networking was not requested. 
{{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 598.271679] env[65680]: DEBUG nova.compute.claims [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 598.271679] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.271872] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.475021] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-336e0952-de29-4066-a098-41f6baaec809 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.480545] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7893731d-7912-4b4e-97e3-fa5b8adea54a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.517701] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37c7e0bc-4873-42d2-8295-7a183111f14e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.526724] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-825b74c9-0dd5-41bd-b085-259300be20d8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.544618] env[65680]: DEBUG nova.compute.provider_tree [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 598.558931] env[65680]: DEBUG nova.scheduler.client.report [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 598.583020] env[65680]: DEBUG oslo_concurrency.lockutils [None 
req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.583020] env[65680]: ERROR nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 598.583020] env[65680]: Faults: ['InvalidArgument'] [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Traceback (most recent call last): [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self.driver.spawn(context, instance, image_meta, [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self._vmops.spawn(context, instance, image_meta, injected_files, [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 598.583020] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self._fetch_image_if_missing(context, vi) [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] image_cache(vi, tmp_image_ds_loc) [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] vm_util.copy_virtual_disk( [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] session._wait_for_task(vmdk_copy_task) [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] return self.wait_for_task(task_ref) [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] return evt.wait() [ 
598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] result = hub.switch() [ 598.583461] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] return self.greenlet.switch() [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] self.f(*self.args, **self.kw) [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] raise exceptions.translate_fault(task_info.error) [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Faults: ['InvalidArgument'] [ 598.583824] env[65680]: ERROR nova.compute.manager [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] [ 598.583824] env[65680]: DEBUG nova.compute.utils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 598.586427] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Build of instance eecb0c81-3810-4edd-b2da-032be990dcdc was re-scheduled: A specified parameter was not correct: fileType [ 598.586427] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 598.587066] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 598.587407] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquiring lock "refresh_cache-eecb0c81-3810-4edd-b2da-032be990dcdc" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.587501] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 
tempest-ServerDiagnosticsV248Test-1787241679-project-member] Acquired lock "refresh_cache-eecb0c81-3810-4edd-b2da-032be990dcdc" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.587583] env[65680]: DEBUG nova.network.neutron [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 598.643177] env[65680]: DEBUG nova.network.neutron [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 598.871501] env[65680]: DEBUG nova.network.neutron [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 598.884231] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Releasing lock "refresh_cache-eecb0c81-3810-4edd-b2da-032be990dcdc" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 598.884231] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 598.884231] env[65680]: DEBUG nova.compute.manager [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] Skipping network deallocation for instance since networking was not requested. 
{{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 598.999016] env[65680]: INFO nova.scheduler.client.report [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Deleted allocations for instance eecb0c81-3810-4edd-b2da-032be990dcdc [ 599.025859] env[65680]: DEBUG oslo_concurrency.lockutils [None req-39508ab0-7ebc-4aa8-8d0b-404c08923ee6 tempest-ServerDiagnosticsV248Test-1787241679 tempest-ServerDiagnosticsV248Test-1787241679-project-member] Lock "eecb0c81-3810-4edd-b2da-032be990dcdc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.861s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 599.026119] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "eecb0c81-3810-4edd-b2da-032be990dcdc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 40.550s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 599.026299] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: eecb0c81-3810-4edd-b2da-032be990dcdc] During sync_power_state the instance has a pending task (spawning). Skip. [ 599.026461] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "eecb0c81-3810-4edd-b2da-032be990dcdc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.733372] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 622.733372] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 622.773473] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 622.773473] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 622.773473] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 622.798567] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804034] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804034] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804034] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804034] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804034] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804751] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 622.804751] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 622.804751] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.292724] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.293296] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.293296] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.293383] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.293497] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.296462] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 624.296462] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 624.320147] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.320147] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.320147] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.320147] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 624.320874] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458e5d3f-9b29-45b5-89cc-44f1e07c4432 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.334712] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab744b8d-ad06-4650-9309-caeba8dc48eb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.355795] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e5187c7-694e-4ea5-a6b8-3b3f468c5196 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.363966] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b814d17-8f2c-4a40-8faa-d0b3f4db580f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.396791] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181076MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 624.397342] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.397767] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.481802] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance a8b4f796-2893-4c05-be82-16a1bfd46db9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.481802] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 4bea49fd-7709-4aa8-86ac-c08ee943dd73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.481802] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d98c190b-7d45-4e74-909d-75b38bfc6554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.481802] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.482040] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.482040] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance fc14c935-fe84-4a49-ac1e-575e56b672a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.482040] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance acbe2170-7ce3-4820-b082-6680e559bde1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 624.482040] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 624.482136] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 624.644060] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eb1c8cd-9e40-4bc2-b89c-296a31ad3b2b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.650753] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4aa7cd3a-2585-40ee-a240-13502ba864d3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.687023] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abce7a80-4633-46a9-9717-9db745b74fd7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.695194] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-107a3751-979e-45d7-957e-7e83df5a4a06 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.709241] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 624.723505] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 624.741455] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 624.741780] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.732584] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce 
tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquiring lock "f05204a0-268f-4d77-a2bf-cde4ee02915e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.732584] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Lock "f05204a0-268f-4d77-a2bf-cde4ee02915e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.745821] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.817412] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.817725] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.821019] env[65680]: INFO nova.compute.claims [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 632.030660] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53ae8437-be21-4be1-ae83-b21da6a8815e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.042065] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95dc4b61-7723-4465-95f3-f2232736118c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.078204] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6afa010d-6896-4ad8-bf13-658a2df33a7b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.086269] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f966925-4e91-4743-b755-79e515024d2c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.100784] env[65680]: DEBUG nova.compute.provider_tree [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Inventory 
has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.109883] env[65680]: DEBUG nova.scheduler.client.report [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.131130] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.131130] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 632.180828] env[65680]: DEBUG nova.compute.utils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 632.182494] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 632.183136] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 632.201376] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 632.308222] env[65680]: DEBUG nova.policy [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd27da803463e443f840365b8ab2ad84e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed86a3040c784758a96761fe3d24bd59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 632.315672] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 632.342226] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 632.342226] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 632.342226] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 632.344525] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 632.344525] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 632.344525] env[65680]: DEBUG nova.virt.hardware [None 
req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 632.344525] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 632.344525] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 632.344691] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 632.344691] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 632.344691] env[65680]: DEBUG nova.virt.hardware [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 632.344691] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35190a5a-fdc0-4e98-9b6d-c4477f8f163a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.355113] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9912bf43-28bc-4d03-8e79-6d532eb60c33 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.440058] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquiring lock "f989cbee-9d5c-459f-b7a0-bf2259dadbb0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.441017] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Lock "f989cbee-9d5c-459f-b7a0-bf2259dadbb0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.453559] env[65680]: DEBUG nova.compute.manager 
[None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 632.550775] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.551100] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.552559] env[65680]: INFO nova.compute.claims [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 632.765275] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f13e34a-9fb6-468c-999b-8cb5a6868717 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.773071] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aac5664d-55cb-4b55-b4cb-68e1e38a468d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.812976] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68a798b1-5946-4523-8209-c9737f858f22 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.820487] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquiring lock "40a7ee3c-8627-47f3-887e-31112586e799" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.820896] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Lock "40a7ee3c-8627-47f3-887e-31112586e799" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.828734] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b21fc536-8818-4ea3-9b82-a8d066535c1a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.841469] env[65680]: DEBUG nova.compute.provider_tree [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 
tempest-ImagesNegativeTestJSON-168801522-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.845075] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 632.850998] env[65680]: DEBUG nova.scheduler.client.report [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.867262] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.868094] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Start building networks asynchronously for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 632.904898] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.906615] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.906935] env[65680]: INFO nova.compute.claims [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 632.916135] env[65680]: DEBUG nova.compute.utils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 632.916135] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 632.916135] env[65680]: DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 632.941250] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 633.038964] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 633.067093] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 633.067371] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 633.067526] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 633.067738] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 633.067884] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 633.068856] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 633.068856] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 633.068856] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 633.069199] env[65680]: DEBUG nova.virt.hardware [None 
req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 633.069199] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 633.069251] env[65680]: DEBUG nova.virt.hardware [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 633.070171] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b25d998-f4d9-4581-b161-e2e6e7b1c0a0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.084157] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc9caf87-5b8e-43f1-bc68-e22c20493f8e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.106095] env[65680]: DEBUG nova.policy [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5e63098fc9284c50950ab77e8b95f278', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5354c98e675493fbb00c885d0766ec5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 633.159952] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Successfully created port: ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 633.195223] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74660260-62ce-4181-a98d-1630611c5381 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.204292] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f675300d-e5ae-4d60-9866-0e20564f5cae {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.241666] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f12b4dee-696d-4ca7-b729-982920e44a5f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.250647] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6a4f3d20-3e1f-4ee1-a282-17702cc10e47 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.268426] env[65680]: DEBUG nova.compute.provider_tree [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 633.284019] env[65680]: DEBUG nova.scheduler.client.report [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 633.311228] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.406s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.311734] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 633.364334] env[65680]: DEBUG nova.compute.utils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 633.365633] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 633.365801] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 633.383175] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 633.493967] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 633.523348] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 633.523570] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 633.523720] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 633.523899] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 633.524041] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 633.524524] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 633.524768] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 633.524954] 
env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 633.525136] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 633.525298] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 633.525464] env[65680]: DEBUG nova.virt.hardware [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 633.527063] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f13daf5f-5c39-40ca-b29b-49cec1a4b1c6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.536764] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e76d30d6-ef96-4da0-9abb-12038168ad71 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.557811] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquiring lock "2f6ce1b8-d869-4219-851a-43ae3ddd3816" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.558057] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Lock "2f6ce1b8-d869-4219-851a-43ae3ddd3816" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.795095] env[65680]: DEBUG nova.policy [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b22f6afcb294ff39813ce0c43ffb38e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '45cf99fd13fc4535802b72e3b07f301d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 634.467114] env[65680]: 
DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Successfully created port: 2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 634.755806] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquiring lock "b163d5b8-b01c-4ace-96e7-56276ab4ba82" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.756265] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Lock "b163d5b8-b01c-4ace-96e7-56276ab4ba82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.791554] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Successfully created port: 6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 635.130509] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Successfully updated port: ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 635.143772] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquiring lock "refresh_cache-f05204a0-268f-4d77-a2bf-cde4ee02915e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.143981] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquired lock "refresh_cache-f05204a0-268f-4d77-a2bf-cde4ee02915e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.144163] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 635.197705] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.418804] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Updating instance_info_cache with network_info: [{"id": "ded655a7-5ab4-4feb-ab0c-65a6d60d802e", "address": "fa:16:3e:26:7b:5b", "network": {"id": "0544c52d-341e-4305-b2f7-53638ebc0a5b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1515093634-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed86a3040c784758a96761fe3d24bd59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "90878b7b-ddb7-4f47-892b-d6e06f73475f", "external-id": "nsx-vlan-transportzone-849", "segmentation_id": 849, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapded655a7-5a", "ovs_interfaceid": "ded655a7-5ab4-4feb-ab0c-65a6d60d802e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.443583] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Releasing lock "refresh_cache-f05204a0-268f-4d77-a2bf-cde4ee02915e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.443911] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance network_info: |[{"id": "ded655a7-5ab4-4feb-ab0c-65a6d60d802e", "address": "fa:16:3e:26:7b:5b", "network": {"id": "0544c52d-341e-4305-b2f7-53638ebc0a5b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1515093634-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed86a3040c784758a96761fe3d24bd59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "90878b7b-ddb7-4f47-892b-d6e06f73475f", "external-id": "nsx-vlan-transportzone-849", "segmentation_id": 849, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapded655a7-5a", "ovs_interfaceid": "ded655a7-5ab4-4feb-ab0c-65a6d60d802e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 635.444288] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:26:7b:5b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '90878b7b-ddb7-4f47-892b-d6e06f73475f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ded655a7-5ab4-4feb-ab0c-65a6d60d802e', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 635.453703] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Creating folder: Project (ed86a3040c784758a96761fe3d24bd59). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.454357] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-71881545-a1cc-4189-9f9f-3ade69b5b527 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.469108] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Created folder: Project (ed86a3040c784758a96761fe3d24bd59) in parent group-v572532. [ 635.469324] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Creating folder: Instances. Parent ref: group-v572565. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.469567] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d809fb36-1466-414c-bd10-da2e1175bab5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.480803] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Created folder: Instances in parent group-v572565. [ 635.481087] env[65680]: DEBUG oslo.service.loopingcall [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 635.481286] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 635.481488] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7a106d7d-9f9d-4bda-9bc4-5b49aa692a5b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.503179] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 635.503179] env[65680]: value = "task-2847873" [ 635.503179] env[65680]: _type = "Task" [ 635.503179] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 635.511607] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847873, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 635.634108] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquiring lock "b935e1a7-1c77-4398-a964-cd7da312fc1b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.634299] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Lock "b935e1a7-1c77-4398-a964-cd7da312fc1b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.013673] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847873, 'name': CreateVM_Task, 'duration_secs': 0.29769} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 636.013837] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 636.017715] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.017715] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.017715] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 636.017715] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c666c0b3-51e9-42df-8972-f47292bd174d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.020510] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Waiting for the task: (returnval){ [ 636.020510] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52e559a6-7fb7-976a-fe2a-df3b23c2c925" [ 636.020510] env[65680]: _type = "Task" 
[ 636.020510] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.028947] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52e559a6-7fb7-976a-fe2a-df3b23c2c925, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.469376] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquiring lock "cb739449-a329-41b8-964c-8c9db383e846" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.471095] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Lock "cb739449-a329-41b8-964c-8c9db383e846" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.534632] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.535160] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 636.535393] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.007838] env[65680]: DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Successfully updated port: 2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.021403] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquiring lock "refresh_cache-f989cbee-9d5c-459f-b7a0-bf2259dadbb0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.022123] env[65680]: DEBUG oslo_concurrency.lockutils [None 
req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquired lock "refresh_cache-f989cbee-9d5c-459f-b7a0-bf2259dadbb0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.022123] env[65680]: DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 637.135190] env[65680]: DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.417069] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Successfully updated port: 6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.435064] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquiring lock "refresh_cache-40a7ee3c-8627-47f3-887e-31112586e799" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.435143] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquired lock "refresh_cache-40a7ee3c-8627-47f3-887e-31112586e799" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.435299] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 637.542281] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.580107] env[65680]: DEBUG nova.compute.manager [req-5e5dec2a-e511-4ebf-992f-eb123bf97ab7 req-b664e11b-b13c-433a-8fbd-03e4812959f5 service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Received event network-vif-plugged-ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 637.580435] env[65680]: DEBUG oslo_concurrency.lockutils [req-5e5dec2a-e511-4ebf-992f-eb123bf97ab7 req-b664e11b-b13c-433a-8fbd-03e4812959f5 service nova] Acquiring lock "f05204a0-268f-4d77-a2bf-cde4ee02915e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.581710] env[65680]: DEBUG oslo_concurrency.lockutils [req-5e5dec2a-e511-4ebf-992f-eb123bf97ab7 req-b664e11b-b13c-433a-8fbd-03e4812959f5 service nova] Lock "f05204a0-268f-4d77-a2bf-cde4ee02915e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.581710] env[65680]: DEBUG oslo_concurrency.lockutils [req-5e5dec2a-e511-4ebf-992f-eb123bf97ab7 req-b664e11b-b13c-433a-8fbd-03e4812959f5 service nova] Lock "f05204a0-268f-4d77-a2bf-cde4ee02915e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.581710] env[65680]: DEBUG nova.compute.manager [req-5e5dec2a-e511-4ebf-992f-eb123bf97ab7 req-b664e11b-b13c-433a-8fbd-03e4812959f5 service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] No waiting events found dispatching network-vif-plugged-ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 637.581710] env[65680]: WARNING nova.compute.manager [req-5e5dec2a-e511-4ebf-992f-eb123bf97ab7 req-b664e11b-b13c-433a-8fbd-03e4812959f5 service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Received unexpected event network-vif-plugged-ded655a7-5ab4-4feb-ab0c-65a6d60d802e for instance with vm_state building and task_state spawning. 
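The records just above show the external-event handshake for instance f05204a0: Neutron reports network-vif-plugged-ded655a7..., the compute manager takes the per-instance "<uuid>-events" lock (the acquire/waited/held timings are emitted by oslo.concurrency's synchronized wrapper, lockutils.py "inner"), finds no registered waiter, and logs the "Received unexpected event" warning because the port was plugged before the spawn path started waiting. The following is a minimal sketch of that registry pattern, not Nova's actual implementation; the class and function names are illustrative only.

# Illustrative sketch of a per-instance external-event registry.
# Assumption: callers register a waiter before triggering the work; the
# notification handler pops it under a "<instance_uuid>-events" lock.
import threading
from collections import defaultdict

from oslo_concurrency import lockutils


class InstanceEventRegistry:
    def __init__(self):
        # instance_uuid -> {event_name: threading.Event}
        self._waiters = defaultdict(dict)

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before starting the work."""
        with lockutils.lock('%s-events' % instance_uuid):
            waiter = threading.Event()
            self._waiters[instance_uuid][event_name] = waiter
            return waiter

    def dispatch(self, instance_uuid, event_name):
        """Called when the external notification (e.g. from Neutron) arrives."""
        with lockutils.lock('%s-events' % instance_uuid):
            waiter = self._waiters[instance_uuid].pop(event_name, None)
        if waiter is None:
            # Mirrors the "Received unexpected event ..." warning above: the
            # notification beat the spawn path to the registry.
            print('unexpected event %s for %s' % (event_name, instance_uuid))
        else:
            waiter.set()


# Usage: the spawn path registers, the notification thread dispatches.
registry = InstanceEventRegistry()
waiter = registry.prepare('f05204a0-268f-4d77-a2bf-cde4ee02915e',
                          'network-vif-plugged-ded655a7')
registry.dispatch('f05204a0-268f-4d77-a2bf-cde4ee02915e',
                  'network-vif-plugged-ded655a7')
waiter.wait(timeout=1)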
[ 637.755561] env[65680]: DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Updating instance_info_cache with network_info: [{"id": "2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7", "address": "fa:16:3e:78:a5:4e", "network": {"id": "06dab409-1321-4c86-8c4a-1482bb5b6a56", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1030673501-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5354c98e675493fbb00c885d0766ec5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77ccbd87-ecfd-4b2d-a1ea-29774addcef6", "external-id": "nsx-vlan-transportzone-385", "segmentation_id": 385, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e0079a8-ba", "ovs_interfaceid": "2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.768979] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Releasing lock "refresh_cache-f989cbee-9d5c-459f-b7a0-bf2259dadbb0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.769460] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance network_info: |[{"id": "2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7", "address": "fa:16:3e:78:a5:4e", "network": {"id": "06dab409-1321-4c86-8c4a-1482bb5b6a56", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1030673501-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5354c98e675493fbb00c885d0766ec5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77ccbd87-ecfd-4b2d-a1ea-29774addcef6", "external-id": "nsx-vlan-transportzone-385", "segmentation_id": 385, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e0079a8-ba", "ovs_interfaceid": "2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 637.771129] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None 
req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:a5:4e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77ccbd87-ecfd-4b2d-a1ea-29774addcef6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 637.781062] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Creating folder: Project (b5354c98e675493fbb00c885d0766ec5). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 637.781062] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1bd9bc74-df49-4ebf-9c4c-894e40a64897 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.792899] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Created folder: Project (b5354c98e675493fbb00c885d0766ec5) in parent group-v572532. [ 637.792899] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Creating folder: Instances. Parent ref: group-v572568. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 637.793039] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e3970405-2e88-482f-8cc4-7a825c60b092 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.803479] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Created folder: Instances in parent group-v572568. [ 637.803479] env[65680]: DEBUG oslo.service.loopingcall [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 637.803479] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 637.803479] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-767e8aa7-ade4-4b97-a19d-5c2cafa3f36d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.826226] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 637.826226] env[65680]: value = "task-2847877" [ 637.826226] env[65680]: _type = "Task" [ 637.826226] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 637.841036] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847877, 'name': CreateVM_Task} progress is 5%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 637.894585] env[65680]: DEBUG oslo_concurrency.lockutils [None req-239cfc49-995f-461e-990f-bf1c33008cd1 tempest-ServersTestJSON-2079208081 tempest-ServersTestJSON-2079208081-project-member] Acquiring lock "3c728886-a983-4071-a728-25d87770556f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.894962] env[65680]: DEBUG oslo_concurrency.lockutils [None req-239cfc49-995f-461e-990f-bf1c33008cd1 tempest-ServersTestJSON-2079208081 tempest-ServersTestJSON-2079208081-project-member] Lock "3c728886-a983-4071-a728-25d87770556f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.282086] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Updating instance_info_cache with network_info: [{"id": "6fd47fa7-f60f-4555-b8ee-8bd5b78a3825", "address": "fa:16:3e:9c:ea:25", "network": {"id": "0a6b89fd-8e2e-4b6b-8800-75c8ff88ebf5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-222818528-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45cf99fd13fc4535802b72e3b07f301d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f1e0e39-0c84-4fcd-9113-cc528c3eb185", "external-id": "nsx-vlan-transportzone-907", "segmentation_id": 907, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6fd47fa7-f6", "ovs_interfaceid": "6fd47fa7-f60f-4555-b8ee-8bd5b78a3825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.295148] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Releasing lock "refresh_cache-40a7ee3c-8627-47f3-887e-31112586e799" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.295148] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance network_info: |[{"id": "6fd47fa7-f60f-4555-b8ee-8bd5b78a3825", "address": "fa:16:3e:9c:ea:25", "network": {"id": "0a6b89fd-8e2e-4b6b-8800-75c8ff88ebf5", 
"bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-222818528-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45cf99fd13fc4535802b72e3b07f301d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f1e0e39-0c84-4fcd-9113-cc528c3eb185", "external-id": "nsx-vlan-transportzone-907", "segmentation_id": 907, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6fd47fa7-f6", "ovs_interfaceid": "6fd47fa7-f60f-4555-b8ee-8bd5b78a3825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 638.295392] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9c:ea:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f1e0e39-0c84-4fcd-9113-cc528c3eb185', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6fd47fa7-f60f-4555-b8ee-8bd5b78a3825', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 638.304051] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Creating folder: Project (45cf99fd13fc4535802b72e3b07f301d). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.304051] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-505a49ca-6959-483d-be93-a8cc9b7ba19e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.315106] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Created folder: Project (45cf99fd13fc4535802b72e3b07f301d) in parent group-v572532. [ 638.315311] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Creating folder: Instances. Parent ref: group-v572571. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 638.315540] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9df161fe-b727-4e09-832c-63c944dedf6a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.325135] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Created folder: Instances in parent group-v572571. 
[ 638.325432] env[65680]: DEBUG oslo.service.loopingcall [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 638.325654] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 638.325886] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1db8eca7-9c68-41cb-970d-2ba45d021804 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.353313] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847877, 'name': CreateVM_Task, 'duration_secs': 0.298156} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 638.354149] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 638.354353] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 638.354353] env[65680]: value = "task-2847880" [ 638.354353] env[65680]: _type = "Task" [ 638.354353] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.355714] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.356091] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.356254] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 638.356499] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9d1bb6ed-af09-499f-b4b4-80c741dc02e1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.372431] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847880, 'name': CreateVM_Task} progress is 6%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 638.372758] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Waiting for the task: (returnval){ [ 638.372758] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]520ba7a0-ce65-6f45-311c-87ef9066c913" [ 638.372758] env[65680]: _type = "Task" [ 638.372758] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.381239] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]520ba7a0-ce65-6f45-311c-87ef9066c913, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 638.462333] env[65680]: DEBUG nova.compute.manager [req-69e34a18-ee29-41ba-9f57-8f81b5292e4b req-a142e2f8-b31f-40e3-8dcd-4d6985698988 service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Received event network-vif-plugged-2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 638.462599] env[65680]: DEBUG oslo_concurrency.lockutils [req-69e34a18-ee29-41ba-9f57-8f81b5292e4b req-a142e2f8-b31f-40e3-8dcd-4d6985698988 service nova] Acquiring lock "f989cbee-9d5c-459f-b7a0-bf2259dadbb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.462814] env[65680]: DEBUG oslo_concurrency.lockutils [req-69e34a18-ee29-41ba-9f57-8f81b5292e4b req-a142e2f8-b31f-40e3-8dcd-4d6985698988 service nova] Lock "f989cbee-9d5c-459f-b7a0-bf2259dadbb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.462951] env[65680]: DEBUG oslo_concurrency.lockutils [req-69e34a18-ee29-41ba-9f57-8f81b5292e4b req-a142e2f8-b31f-40e3-8dcd-4d6985698988 service nova] Lock "f989cbee-9d5c-459f-b7a0-bf2259dadbb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.463130] env[65680]: DEBUG nova.compute.manager [req-69e34a18-ee29-41ba-9f57-8f81b5292e4b req-a142e2f8-b31f-40e3-8dcd-4d6985698988 service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] No waiting events found dispatching network-vif-plugged-2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 638.463287] env[65680]: WARNING nova.compute.manager [req-69e34a18-ee29-41ba-9f57-8f81b5292e4b req-a142e2f8-b31f-40e3-8dcd-4d6985698988 service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Received unexpected event network-vif-plugged-2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 for instance with vm_state building and task_state spawning. [ 638.868074] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847880, 'name': CreateVM_Task, 'duration_secs': 0.290534} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 638.868726] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 638.869497] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.889964] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.893395] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 638.893395] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.893395] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.893395] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 638.893547] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4eca6c6f-b392-4433-a520-65deab9400d5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.900032] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Waiting for the task: (returnval){ [ 638.900032] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52b84ad6-3f8d-b527-2e20-c0459b371352" [ 638.900032] env[65680]: _type = "Task" [ 638.900032] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.909594] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52b84ad6-3f8d-b527-2e20-c0459b371352, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 639.068642] env[65680]: DEBUG oslo_concurrency.lockutils [None req-46c28720-73a0-4f0c-8d38-e23ea117d331 tempest-ImagesTestJSON-24601272 tempest-ImagesTestJSON-24601272-project-member] Acquiring lock "ba875739-2ff0-4778-89cf-5b32f2ffe6fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.068878] env[65680]: DEBUG oslo_concurrency.lockutils [None req-46c28720-73a0-4f0c-8d38-e23ea117d331 tempest-ImagesTestJSON-24601272 tempest-ImagesTestJSON-24601272-project-member] Lock "ba875739-2ff0-4778-89cf-5b32f2ffe6fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.412978] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.414810] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 639.414810] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 640.464783] env[65680]: DEBUG nova.compute.manager [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Received event network-changed-ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 640.464783] env[65680]: DEBUG nova.compute.manager [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Refreshing instance network info cache due to event network-changed-ded655a7-5ab4-4feb-ab0c-65a6d60d802e. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 640.464783] env[65680]: DEBUG oslo_concurrency.lockutils [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] Acquiring lock "refresh_cache-f05204a0-268f-4d77-a2bf-cde4ee02915e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 640.465143] env[65680]: DEBUG oslo_concurrency.lockutils [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] Acquired lock "refresh_cache-f05204a0-268f-4d77-a2bf-cde4ee02915e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 640.465143] env[65680]: DEBUG nova.network.neutron [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Refreshing network info cache for port ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 640.623852] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Acquiring lock "d1f6ea52-4367-4756-88ff-37830ce1aeba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.623852] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Lock "d1f6ea52-4367-4756-88ff-37830ce1aeba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.652156] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Acquiring lock "11374639-ed45-4999-b8b9-fdbf08b9d8bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.652508] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Lock "11374639-ed45-4999-b8b9-fdbf08b9d8bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.690368] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Acquiring lock "a3876ce4-3e1d-4450-896c-b8321cc1a312" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.690742] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 
tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Lock "a3876ce4-3e1d-4450-896c-b8321cc1a312" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.202469] env[65680]: DEBUG nova.network.neutron [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Updated VIF entry in instance network info cache for port ded655a7-5ab4-4feb-ab0c-65a6d60d802e. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 641.202469] env[65680]: DEBUG nova.network.neutron [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Updating instance_info_cache with network_info: [{"id": "ded655a7-5ab4-4feb-ab0c-65a6d60d802e", "address": "fa:16:3e:26:7b:5b", "network": {"id": "0544c52d-341e-4305-b2f7-53638ebc0a5b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1515093634-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed86a3040c784758a96761fe3d24bd59", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "90878b7b-ddb7-4f47-892b-d6e06f73475f", "external-id": "nsx-vlan-transportzone-849", "segmentation_id": 849, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapded655a7-5a", "ovs_interfaceid": "ded655a7-5ab4-4feb-ab0c-65a6d60d802e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.213401] env[65680]: DEBUG oslo_concurrency.lockutils [req-bfb2da04-b8f0-4664-b6c7-86e7ac249900 req-7eca526c-5e89-4475-aa0b-bef2f960484f service nova] Releasing lock "refresh_cache-f05204a0-268f-4d77-a2bf-cde4ee02915e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 641.865366] env[65680]: DEBUG nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Received event network-vif-plugged-6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 641.865366] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Acquiring lock "40a7ee3c-8627-47f3-887e-31112586e799-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.865366] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Lock "40a7ee3c-8627-47f3-887e-31112586e799-events" 
acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.865366] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Lock "40a7ee3c-8627-47f3-887e-31112586e799-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.865839] env[65680]: DEBUG nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] No waiting events found dispatching network-vif-plugged-6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 641.865839] env[65680]: WARNING nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Received unexpected event network-vif-plugged-6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 for instance with vm_state building and task_state spawning. [ 641.865839] env[65680]: DEBUG nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Received event network-changed-2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 641.865839] env[65680]: DEBUG nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Refreshing instance network info cache due to event network-changed-2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 641.865839] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Acquiring lock "refresh_cache-f989cbee-9d5c-459f-b7a0-bf2259dadbb0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.866350] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Acquired lock "refresh_cache-f989cbee-9d5c-459f-b7a0-bf2259dadbb0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 641.866350] env[65680]: DEBUG nova.network.neutron [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Refreshing network info cache for port 2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 642.123662] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6303a8ec-1583-4bc0-9dd9-67ad950255d0 tempest-ServersTestJSON-1264796280 tempest-ServersTestJSON-1264796280-project-member] Acquiring lock "b663e64f-77a5-4938-a492-df6f05bb182e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.124524] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6303a8ec-1583-4bc0-9dd9-67ad950255d0 tempest-ServersTestJSON-1264796280 tempest-ServersTestJSON-1264796280-project-member] Lock "b663e64f-77a5-4938-a492-df6f05bb182e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.254708] env[65680]: DEBUG nova.network.neutron [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Updated VIF entry in instance network info cache for port 2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 642.255053] env[65680]: DEBUG nova.network.neutron [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Updating instance_info_cache with network_info: [{"id": "2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7", "address": "fa:16:3e:78:a5:4e", "network": {"id": "06dab409-1321-4c86-8c4a-1482bb5b6a56", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1030673501-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b5354c98e675493fbb00c885d0766ec5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77ccbd87-ecfd-4b2d-a1ea-29774addcef6", "external-id": "nsx-vlan-transportzone-385", "segmentation_id": 385, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e0079a8-ba", "ovs_interfaceid": "2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.266183] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Releasing lock "refresh_cache-f989cbee-9d5c-459f-b7a0-bf2259dadbb0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.266414] env[65680]: DEBUG nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Received event network-changed-6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 642.266589] env[65680]: DEBUG nova.compute.manager [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Refreshing instance network info cache due to event network-changed-6fd47fa7-f60f-4555-b8ee-8bd5b78a3825. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 642.267036] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Acquiring lock "refresh_cache-40a7ee3c-8627-47f3-887e-31112586e799" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 642.267222] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Acquired lock "refresh_cache-40a7ee3c-8627-47f3-887e-31112586e799" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 642.267399] env[65680]: DEBUG nova.network.neutron [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Refreshing network info cache for port 6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 642.619451] env[65680]: DEBUG nova.network.neutron [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Updated VIF entry in instance network info cache for port 6fd47fa7-f60f-4555-b8ee-8bd5b78a3825. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 642.619874] env[65680]: DEBUG nova.network.neutron [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Updating instance_info_cache with network_info: [{"id": "6fd47fa7-f60f-4555-b8ee-8bd5b78a3825", "address": "fa:16:3e:9c:ea:25", "network": {"id": "0a6b89fd-8e2e-4b6b-8800-75c8ff88ebf5", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-222818528-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45cf99fd13fc4535802b72e3b07f301d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f1e0e39-0c84-4fcd-9113-cc528c3eb185", "external-id": "nsx-vlan-transportzone-907", "segmentation_id": 907, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6fd47fa7-f6", "ovs_interfaceid": "6fd47fa7-f60f-4555-b8ee-8bd5b78a3825", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.629645] env[65680]: DEBUG oslo_concurrency.lockutils [req-9384e8e7-3101-416c-b358-a294fc93f3df req-9bcd7bd1-ce29-4577-b0de-d57d83fe56fb service nova] Releasing lock "refresh_cache-40a7ee3c-8627-47f3-887e-31112586e799" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 643.613756] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e8bed0cd-990f-4042-aa34-f54cfec813f7 tempest-AttachInterfacesTestJSON-1874593521 tempest-AttachInterfacesTestJSON-1874593521-project-member] Acquiring lock 
"c9022be0-026f-4f6a-b720-162cadcd76bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 643.614071] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e8bed0cd-990f-4042-aa34-f54cfec813f7 tempest-AttachInterfacesTestJSON-1874593521 tempest-AttachInterfacesTestJSON-1874593521-project-member] Lock "c9022be0-026f-4f6a-b720-162cadcd76bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 643.850716] env[65680]: DEBUG oslo_concurrency.lockutils [None req-22330e79-fe29-411a-8c0c-30be455b8072 tempest-ServerGroupTestJSON-1851943177 tempest-ServerGroupTestJSON-1851943177-project-member] Acquiring lock "ce17e81b-0291-4309-8594-28ea20c530a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 643.850953] env[65680]: DEBUG oslo_concurrency.lockutils [None req-22330e79-fe29-411a-8c0c-30be455b8072 tempest-ServerGroupTestJSON-1851943177 tempest-ServerGroupTestJSON-1851943177-project-member] Lock "ce17e81b-0291-4309-8594-28ea20c530a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 645.251810] env[65680]: DEBUG oslo_concurrency.lockutils [None req-eba879e1-ec5d-45a6-a716-21597e3f78a5 tempest-ServersTestMultiNic-905135208 tempest-ServersTestMultiNic-905135208-project-member] Acquiring lock "53010485-3888-4669-85b7-01381f0bffcd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 645.252166] env[65680]: DEBUG oslo_concurrency.lockutils [None req-eba879e1-ec5d-45a6-a716-21597e3f78a5 tempest-ServersTestMultiNic-905135208 tempest-ServersTestMultiNic-905135208-project-member] Lock "53010485-3888-4669-85b7-01381f0bffcd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.783769] env[65680]: WARNING oslo_vmware.rw_handles [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 646.783769] env[65680]: ERROR 
oslo_vmware.rw_handles version, status, reason = self._read_status() [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 646.783769] env[65680]: ERROR oslo_vmware.rw_handles [ 646.784372] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 646.785791] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 646.788376] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Copying Virtual Disk [datastore1] vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/ed4eb7ba-c7bb-49b1-9ab7-ba60f7c226ed/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 646.789996] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c5f5b326-e094-4ed0-b063-19b02e09e745 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.800140] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Waiting for the task: (returnval){ [ 646.800140] env[65680]: value = "task-2847881" [ 646.800140] env[65680]: _type = "Task" [ 646.800140] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 646.810271] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Task: {'id': task-2847881, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 647.311819] env[65680]: DEBUG oslo_vmware.exceptions [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 647.312076] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 647.312633] env[65680]: ERROR nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 647.312633] env[65680]: Faults: ['InvalidArgument'] [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Traceback (most recent call last): [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] yield resources [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self.driver.spawn(context, instance, image_meta, [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self._fetch_image_if_missing(context, vi) [ 647.312633] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] image_cache(vi, tmp_image_ds_loc) [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] vm_util.copy_virtual_disk( [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] session._wait_for_task(vmdk_copy_task) [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] return self.wait_for_task(task_ref) [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] return evt.wait() [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] result = hub.switch() [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 647.313131] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] return self.greenlet.switch() [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self.f(*self.args, **self.kw) [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] raise exceptions.translate_fault(task_info.error) [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Faults: ['InvalidArgument'] [ 647.313611] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] [ 647.313611] env[65680]: INFO nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Terminating instance [ 647.314556] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 647.314661] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 647.315518] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Start destroying the instance on the 
hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 647.315708] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 647.315932] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-90cd2fd7-c88b-4efc-9511-f88dd587dfa4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.318412] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-027f6f24-3e95-4361-bbba-43ef465bf8b9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.327583] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 647.327583] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-341660a0-e9a9-4ba8-bfaa-21a48ca6f35b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.332112] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 647.332112] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 647.332736] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac6cb46e-7d95-4fb8-b8ed-9eb467ae4e41 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.339360] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Waiting for the task: (returnval){ [ 647.339360] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]529d0b1e-c159-e66b-249c-16c3f16631ab" [ 647.339360] env[65680]: _type = "Task" [ 647.339360] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 647.351423] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]529d0b1e-c159-e66b-249c-16c3f16631ab, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 647.398077] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 647.398337] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 647.398483] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Deleting the datastore file [datastore1] a8b4f796-2893-4c05-be82-16a1bfd46db9 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 647.398739] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-87e7dd3d-a1d5-4f0b-8b9d-d1fd7cb24ab6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.405026] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Waiting for the task: (returnval){ [ 647.405026] env[65680]: value = "task-2847883" [ 647.405026] env[65680]: _type = "Task" [ 647.405026] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 647.848752] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 647.849012] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Creating directory with path [datastore1] vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 647.849238] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53eb4258-42e6-4b14-8b8b-6ff438d59203 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.861049] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Created directory with path [datastore1] vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 647.861049] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Fetch image to [datastore1] vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 647.861049] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 647.861581] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-610fd8ed-eddb-4325-8a5a-307e0e3e0c75 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.867974] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4896b18-9bd7-493d-a23e-ba330ef65a38 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.876966] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c85976d-deda-4091-84fb-1e16c3c6bf84 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.913261] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-913d90fa-bd8d-4993-9db2-a22f1c82cceb {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.919550] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c53f2bd9-11e1-40e5-ac87-7de5ce841345 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.921253] env[65680]: DEBUG oslo_vmware.api [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Task: {'id': task-2847883, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081034} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 647.921467] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 647.921635] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 647.921794] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 647.921967] env[65680]: INFO nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 647.923997] env[65680]: DEBUG nova.compute.claims [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 647.924173] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.924375] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.942660] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 647.996413] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 648.057000] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 648.057270] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 648.341967] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9485027-6461-4b00-95c9-5d0c7a9b0c28 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 648.349920] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6075dc3-325c-49e6-82f7-310eaac0cbfb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 648.386840] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-491ed417-1c0e-4ce9-bf0a-b9612f941853 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 648.396173] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-101c0f3d-96c2-49c8-89e2-5d3359396109 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 648.410285] env[65680]: DEBUG nova.compute.provider_tree [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 648.418684] env[65680]: DEBUG nova.scheduler.client.report [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 648.434706] env[65680]: DEBUG oslo_concurrency.lockutils [None req-38a99a56-39aa-4a84-a86d-e5cbeac7b8cf tempest-ServerRescueTestJSONUnderV235-332028835 tempest-ServerRescueTestJSONUnderV235-332028835-project-member] Acquiring lock "017fddb0-49d5-434d-997d-126119a989ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.434706] env[65680]: DEBUG oslo_concurrency.lockutils [None req-38a99a56-39aa-4a84-a86d-e5cbeac7b8cf tempest-ServerRescueTestJSONUnderV235-332028835 tempest-ServerRescueTestJSONUnderV235-332028835-project-member] Lock "017fddb0-49d5-434d-997d-126119a989ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.438130] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.513s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.438130] env[65680]: ERROR nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 648.438130] env[65680]: Faults: ['InvalidArgument'] [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Traceback (most recent call last): [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self.driver.spawn(context, instance, image_meta, [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 648.438130] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self._fetch_image_if_missing(context, vi) [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] image_cache(vi, tmp_image_ds_loc) [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] vm_util.copy_virtual_disk( [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] session._wait_for_task(vmdk_copy_task) [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] return self.wait_for_task(task_ref) [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] return evt.wait() [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 
648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] result = hub.switch() [ 648.438775] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] return self.greenlet.switch() [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] self.f(*self.args, **self.kw) [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] raise exceptions.translate_fault(task_info.error) [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Faults: ['InvalidArgument'] [ 648.439138] env[65680]: ERROR nova.compute.manager [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] [ 648.440450] env[65680]: DEBUG nova.compute.utils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 648.441491] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Build of instance a8b4f796-2893-4c05-be82-16a1bfd46db9 was re-scheduled: A specified parameter was not correct: fileType [ 648.441491] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 648.441965] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 648.442226] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 648.442451] env[65680]: DEBUG nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 648.442673] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 648.813170] env[65680]: DEBUG nova.network.neutron [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 648.829951] env[65680]: INFO nova.compute.manager [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] Took 0.39 seconds to deallocate network for instance. [ 648.928013] env[65680]: INFO nova.scheduler.client.report [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Deleted allocations for instance a8b4f796-2893-4c05-be82-16a1bfd46db9 [ 648.952562] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b26072e4-d6d5-4084-ae89-72614a4569f7 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "a8b4f796-2893-4c05-be82-16a1bfd46db9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.213s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.953862] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "a8b4f796-2893-4c05-be82-16a1bfd46db9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 90.477s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.954111] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: a8b4f796-2893-4c05-be82-16a1bfd46db9] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 648.954300] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "a8b4f796-2893-4c05-be82-16a1bfd46db9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.961878] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 649.014176] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 649.014443] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 649.015954] env[65680]: INFO nova.compute.claims [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 649.316260] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97c02319-49ff-4441-aa7e-aa0ec3767762 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 649.324758] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3b8d5c-fc4b-449e-82d7-bb48b76755aa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 649.353710] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41459bdc-ceb9-4c71-b6c5-1be9ff5ce2a8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 649.360857] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3812a1a9-8f62-4097-bbf4-08a4c30f57f8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 649.375652] env[65680]: DEBUG nova.compute.provider_tree [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 649.389020] env[65680]: DEBUG nova.scheduler.client.report [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 
tempest-ServerRescueNegativeTestJSON-17755879-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 649.400870] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 649.401415] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 649.438016] env[65680]: DEBUG nova.compute.utils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 649.438390] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 649.438659] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 649.447027] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 649.506095] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 649.516077] env[65680]: DEBUG nova.policy [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5aaeb0640d764701bf924e9823933c5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '770a09a1c5964ef9a24ce7e8d7ea4146', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 649.528238] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 649.528238] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 649.528392] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 649.529028] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 649.529028] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 649.529028] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 649.529028] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 649.529187] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 649.529297] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 649.529455] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 649.529626] env[65680]: DEBUG nova.virt.hardware [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 649.530772] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95612091-32a0-42da-9e96-56f7a32d3454 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 649.538480] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0081684f-e03a-4c94-a13f-cea0b212a997 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 650.083329] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Successfully created port: 704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 650.807551] env[65680]: DEBUG oslo_concurrency.lockutils [None req-f37223fd-2de0-44a5-8ec5-d5bba1973458 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Acquiring lock "d7b468db-115d-4b24-b604-5edb176dbf96" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.807789] env[65680]: DEBUG oslo_concurrency.lockutils [None req-f37223fd-2de0-44a5-8ec5-d5bba1973458 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "d7b468db-115d-4b24-b604-5edb176dbf96" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.878479] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Successfully updated port: 704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 650.888020] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquiring lock "refresh_cache-2f6ce1b8-d869-4219-851a-43ae3ddd3816" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 650.888178] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquired lock "refresh_cache-2f6ce1b8-d869-4219-851a-43ae3ddd3816" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 650.888327] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 650.945353] env[65680]: DEBUG nova.compute.manager [req-de3180c6-a415-401d-a202-9dfd3a72cc86 req-bbd49898-116e-4e04-a673-0d5bffa12532 service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Received event network-vif-plugged-704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 650.945607] env[65680]: DEBUG oslo_concurrency.lockutils [req-de3180c6-a415-401d-a202-9dfd3a72cc86 req-bbd49898-116e-4e04-a673-0d5bffa12532 service nova] Acquiring lock "2f6ce1b8-d869-4219-851a-43ae3ddd3816-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.945876] env[65680]: DEBUG oslo_concurrency.lockutils [req-de3180c6-a415-401d-a202-9dfd3a72cc86 req-bbd49898-116e-4e04-a673-0d5bffa12532 service nova] Lock "2f6ce1b8-d869-4219-851a-43ae3ddd3816-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.945929] env[65680]: DEBUG oslo_concurrency.lockutils [req-de3180c6-a415-401d-a202-9dfd3a72cc86 req-bbd49898-116e-4e04-a673-0d5bffa12532 service nova] Lock "2f6ce1b8-d869-4219-851a-43ae3ddd3816-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.946091] env[65680]: DEBUG nova.compute.manager [req-de3180c6-a415-401d-a202-9dfd3a72cc86 req-bbd49898-116e-4e04-a673-0d5bffa12532 service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] No waiting events found dispatching 
network-vif-plugged-704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 650.946312] env[65680]: WARNING nova.compute.manager [req-de3180c6-a415-401d-a202-9dfd3a72cc86 req-bbd49898-116e-4e04-a673-0d5bffa12532 service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Received unexpected event network-vif-plugged-704ca6e3-8d5a-411b-9796-eb0201636d9c for instance with vm_state building and task_state spawning. [ 650.952405] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 651.163063] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Updating instance_info_cache with network_info: [{"id": "704ca6e3-8d5a-411b-9796-eb0201636d9c", "address": "fa:16:3e:75:a2:9c", "network": {"id": "2edb818a-c183-4490-aef6-896862dbf816", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-174619045-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "770a09a1c5964ef9a24ce7e8d7ea4146", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "571cdf48-7016-4715-8739-4cb70c90cd6d", "external-id": "nsx-vlan-transportzone-360", "segmentation_id": 360, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap704ca6e3-8d", "ovs_interfaceid": "704ca6e3-8d5a-411b-9796-eb0201636d9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 651.173593] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Releasing lock "refresh_cache-2f6ce1b8-d869-4219-851a-43ae3ddd3816" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 651.173888] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance network_info: |[{"id": "704ca6e3-8d5a-411b-9796-eb0201636d9c", "address": "fa:16:3e:75:a2:9c", "network": {"id": "2edb818a-c183-4490-aef6-896862dbf816", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-174619045-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "770a09a1c5964ef9a24ce7e8d7ea4146", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "571cdf48-7016-4715-8739-4cb70c90cd6d", "external-id": "nsx-vlan-transportzone-360", "segmentation_id": 360, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap704ca6e3-8d", "ovs_interfaceid": "704ca6e3-8d5a-411b-9796-eb0201636d9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 651.174410] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:75:a2:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '571cdf48-7016-4715-8739-4cb70c90cd6d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '704ca6e3-8d5a-411b-9796-eb0201636d9c', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 651.182248] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Creating folder: Project (770a09a1c5964ef9a24ce7e8d7ea4146). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 651.182726] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ff8aff0d-8a06-49a4-a388-e7751c8301db {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 651.193555] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Created folder: Project (770a09a1c5964ef9a24ce7e8d7ea4146) in parent group-v572532. [ 651.193731] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Creating folder: Instances. Parent ref: group-v572574. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 651.193937] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e84af41-109b-4389-bdbc-e6a8220a0dbb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 651.203087] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Created folder: Instances in parent group-v572574. 
[ 651.203357] env[65680]: DEBUG oslo.service.loopingcall [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 651.203485] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 651.203667] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f61ca907-101c-4cc1-a1be-fbd974cc0e6f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 651.224050] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 651.224050] env[65680]: value = "task-2847886" [ 651.224050] env[65680]: _type = "Task" [ 651.224050] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 651.230987] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847886, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 651.734325] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847886, 'name': CreateVM_Task, 'duration_secs': 0.311871} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 651.734501] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 651.735165] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 651.735329] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 651.735637] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 651.735868] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b2cb34db-1b0b-42b8-95fa-c8f892dccc41 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 651.740324] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Waiting for the task: (returnval){ [ 651.740324] 
env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52d7c3d2-dd38-2e96-6cb8-c6dfb4131713" [ 651.740324] env[65680]: _type = "Task" [ 651.740324] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 651.747645] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52d7c3d2-dd38-2e96-6cb8-c6dfb4131713, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 652.250561] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 652.250854] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 652.251083] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 653.017637] env[65680]: DEBUG nova.compute.manager [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Received event network-changed-704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 653.017863] env[65680]: DEBUG nova.compute.manager [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Refreshing instance network info cache due to event network-changed-704ca6e3-8d5a-411b-9796-eb0201636d9c. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 653.018091] env[65680]: DEBUG oslo_concurrency.lockutils [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] Acquiring lock "refresh_cache-2f6ce1b8-d869-4219-851a-43ae3ddd3816" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 653.018238] env[65680]: DEBUG oslo_concurrency.lockutils [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] Acquired lock "refresh_cache-2f6ce1b8-d869-4219-851a-43ae3ddd3816" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 653.018392] env[65680]: DEBUG nova.network.neutron [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Refreshing network info cache for port 704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 653.315979] env[65680]: DEBUG nova.network.neutron [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Updated VIF entry in instance network info cache for port 704ca6e3-8d5a-411b-9796-eb0201636d9c. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 653.316353] env[65680]: DEBUG nova.network.neutron [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Updating instance_info_cache with network_info: [{"id": "704ca6e3-8d5a-411b-9796-eb0201636d9c", "address": "fa:16:3e:75:a2:9c", "network": {"id": "2edb818a-c183-4490-aef6-896862dbf816", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-174619045-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "770a09a1c5964ef9a24ce7e8d7ea4146", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "571cdf48-7016-4715-8739-4cb70c90cd6d", "external-id": "nsx-vlan-transportzone-360", "segmentation_id": 360, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap704ca6e3-8d", "ovs_interfaceid": "704ca6e3-8d5a-411b-9796-eb0201636d9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.326280] env[65680]: DEBUG oslo_concurrency.lockutils [req-15af5746-3620-4b65-bed3-a46a176ce11e req-8b01c0ca-3af8-47cd-899c-b9f52d05144c service nova] Releasing lock "refresh_cache-2f6ce1b8-d869-4219-851a-43ae3ddd3816" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 682.741066] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 683.288571] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 683.292186] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 683.292349] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 683.292498] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 683.312417] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.312614] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.312756] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.312884] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313018] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313142] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313263] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313385] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313507] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313626] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 683.313740] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 684.294669] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 684.294669] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.293415] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.293618] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.293778] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.305994] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.306244] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.306413] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.306578] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available 
compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 685.307929] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c04e639-2eef-48f5-be12-2db3d5dcd9cd {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.316560] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47445ea4-cfbe-4a3c-8180-c4ebd6f81695 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.330611] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2695b968-0de7-4ae0-8b43-6f596e10aa0d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.337372] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6761fb1-913e-4f92-8cbc-25f1e6cf92ae {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.367923] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181059MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 685.368135] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.368318] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.431814] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 4bea49fd-7709-4aa8-86ac-c08ee943dd73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.431970] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d98c190b-7d45-4e74-909d-75b38bfc6554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432114] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432240] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432362] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance fc14c935-fe84-4a49-ac1e-575e56b672a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432536] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance acbe2170-7ce3-4820-b082-6680e559bde1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432684] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance f05204a0-268f-4d77-a2bf-cde4ee02915e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432805] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.432920] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 40a7ee3c-8627-47f3-887e-31112586e799 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.433042] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 2f6ce1b8-d869-4219-851a-43ae3ddd3816 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 685.455583] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b163d5b8-b01c-4ace-96e7-56276ab4ba82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.478079] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b935e1a7-1c77-4398-a964-cd7da312fc1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.487362] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance cb739449-a329-41b8-964c-8c9db383e846 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.496314] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 3c728886-a983-4071-a728-25d87770556f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.505101] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance ba875739-2ff0-4778-89cf-5b32f2ffe6fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.513550] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d1f6ea52-4367-4756-88ff-37830ce1aeba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.521994] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 11374639-ed45-4999-b8b9-fdbf08b9d8bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.530524] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance a3876ce4-3e1d-4450-896c-b8321cc1a312 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.539094] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b663e64f-77a5-4938-a492-df6f05bb182e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.547555] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c9022be0-026f-4f6a-b720-162cadcd76bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.556905] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance ce17e81b-0291-4309-8594-28ea20c530a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.565250] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53010485-3888-4669-85b7-01381f0bffcd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.573591] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 017fddb0-49d5-434d-997d-126119a989ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.581878] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d7b468db-115d-4b24-b604-5edb176dbf96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 685.582167] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 685.582267] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 685.865371] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5919f1de-c01c-4dbf-b903-617392a1ad00 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.873125] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-223b5918-219c-444e-a5a6-8493073d6fb7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.906688] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f76a24d4-2052-4328-88d9-63dc62c3ce46 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.911538] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-765b92fc-6e36-4e39-ad8f-6be4c864d1a9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.924746] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 685.932998] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 685.947455] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 685.947655] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.946579] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running 
periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 686.946919] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 696.802480] env[65680]: WARNING oslo_vmware.rw_handles [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 696.802480] env[65680]: ERROR oslo_vmware.rw_handles [ 696.803131] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 696.804692] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 696.804959] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Copying Virtual Disk [datastore1] vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/15277df7-7a55-4f9c-aefb-bb873c9c9601/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 696.805297] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-86d0ebe9-a775-4333-bdc1-43e69e123ac2 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.814393] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Waiting for the task: (returnval){ [ 696.814393] env[65680]: value = "task-2847887" [ 696.814393] env[65680]: _type = "Task" [ 696.814393] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 696.822196] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Task: {'id': task-2847887, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 697.325205] env[65680]: DEBUG oslo_vmware.exceptions [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 697.325205] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 697.325627] env[65680]: ERROR nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 697.325627] env[65680]: Faults: ['InvalidArgument'] [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Traceback (most recent call last): [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] yield resources [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self.driver.spawn(context, instance, image_meta, [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self._vmops.spawn(context, instance, image_meta, injected_files, [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self._fetch_image_if_missing(context, vi) [ 697.325627] env[65680]: ERROR nova.compute.manager [instance: 
4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] image_cache(vi, tmp_image_ds_loc) [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] vm_util.copy_virtual_disk( [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] session._wait_for_task(vmdk_copy_task) [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] return self.wait_for_task(task_ref) [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] return evt.wait() [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] result = hub.switch() [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 697.325981] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] return self.greenlet.switch() [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self.f(*self.args, **self.kw) [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] raise exceptions.translate_fault(task_info.error) [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Faults: ['InvalidArgument'] [ 697.326373] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] [ 697.326373] env[65680]: INFO nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Terminating instance [ 
697.327432] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 697.327634] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 697.327866] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-89aae777-eebf-47e6-a5ea-82055b91e7fe {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.330241] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 697.330479] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 697.331201] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8c3fc3c-dcfc-4b73-ac24-ae0b4ae19bdc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.338376] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 697.338557] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b0e98cdf-f96b-4bfc-8f77-3612031adcd1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.340785] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 697.340956] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 697.342038] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6bf8ce1c-42ff-409b-bf1d-8c73c0bf61dd {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.348897] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Waiting for the task: (returnval){ [ 697.348897] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52661ecf-913f-3f28-a12b-54645cba63fe" [ 697.348897] env[65680]: _type = "Task" [ 697.348897] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 697.361475] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 697.361717] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Creating directory with path [datastore1] vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 697.361925] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-69600b84-e7b4-42f1-aade-02911fcf6b3a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.384463] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Created directory with path [datastore1] vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 697.384665] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Fetch image to [datastore1] vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 697.384830] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 697.385581] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e22afde-f562-4bd3-9d06-1a9b61c18ad0 
{{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.394895] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-906ba32b-9aa0-4e58-aafa-a9641364c252 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.405356] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6970586e-4f12-4d19-be99-6007559f8254 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.438686] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d4900e9-1b0e-4d18-80cd-d771d40f2596 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.441209] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 697.441457] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 697.441722] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Deleting the datastore file [datastore1] 4bea49fd-7709-4aa8-86ac-c08ee943dd73 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 697.441949] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5613c37b-591b-4676-b314-aa25cb031ec3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.448412] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-22a0d00e-4e30-480e-b21f-3e8641c81c8d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.450590] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Waiting for the task: (returnval){ [ 697.450590] env[65680]: value = "task-2847889" [ 697.450590] env[65680]: _type = "Task" [ 697.450590] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 697.457456] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Task: {'id': task-2847889, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 697.535772] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 697.582330] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 697.639482] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 697.639649] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 697.960569] env[65680]: DEBUG oslo_vmware.api [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Task: {'id': task-2847889, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069304} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 697.960871] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 697.961064] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 697.961286] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 697.961502] env[65680]: INFO nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Took 0.63 seconds to destroy the instance on the hypervisor. [ 697.966079] env[65680]: DEBUG nova.compute.claims [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 697.966317] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.966584] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.266702] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16940fd8-7a79-4c51-8a5a-de77c1ed46a4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.274675] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b7df7be-148e-47f6-adbc-e65da8d052c7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.305372] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54aa4de7-ce72-4d3f-a67c-036ade9d7673 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.312715] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c88fa9eb-e682-4d75-9e9b-719ee809473f {{(pid=65680) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.325539] env[65680]: DEBUG nova.compute.provider_tree [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 698.333620] env[65680]: DEBUG nova.scheduler.client.report [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 698.347796] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.381s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.348354] env[65680]: ERROR nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 698.348354] env[65680]: Faults: ['InvalidArgument'] [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Traceback (most recent call last): [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self.driver.spawn(context, instance, image_meta, [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self._vmops.spawn(context, instance, image_meta, injected_files, [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self._fetch_image_if_missing(context, vi) [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 698.348354] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] image_cache(vi, tmp_image_ds_loc) [ 698.348354] env[65680]: ERROR 
nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] vm_util.copy_virtual_disk( [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] session._wait_for_task(vmdk_copy_task) [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] return self.wait_for_task(task_ref) [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] return evt.wait() [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] result = hub.switch() [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] return self.greenlet.switch() [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 698.348916] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] self.f(*self.args, **self.kw) [ 698.349569] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 698.349569] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] raise exceptions.translate_fault(task_info.error) [ 698.349569] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 698.349569] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Faults: ['InvalidArgument'] [ 698.349569] env[65680]: ERROR nova.compute.manager [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] [ 698.349569] env[65680]: DEBUG nova.compute.utils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 698.350549] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 
4bea49fd-7709-4aa8-86ac-c08ee943dd73] Build of instance 4bea49fd-7709-4aa8-86ac-c08ee943dd73 was re-scheduled: A specified parameter was not correct: fileType [ 698.350549] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 698.350928] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 698.351148] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 698.351315] env[65680]: DEBUG nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 698.351477] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 698.690863] env[65680]: DEBUG nova.network.neutron [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.704070] env[65680]: INFO nova.compute.manager [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] Took 0.35 seconds to deallocate network for instance. 
[ 698.789933] env[65680]: INFO nova.scheduler.client.report [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Deleted allocations for instance 4bea49fd-7709-4aa8-86ac-c08ee943dd73 [ 698.808203] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c09d7bb2-aa2b-4915-b8c6-170bd3f4b049 tempest-TenantUsagesTestJSON-1938623030 tempest-TenantUsagesTestJSON-1938623030-project-member] Lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 150.429s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.809286] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 140.333s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.809468] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 4bea49fd-7709-4aa8-86ac-c08ee943dd73] During sync_power_state the instance has a pending task (spawning). Skip. [ 698.809641] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "4bea49fd-7709-4aa8-86ac-c08ee943dd73" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.834711] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 698.881182] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.881474] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.882971] env[65680]: INFO nova.compute.claims [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 699.200290] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdaa4f47-5fc5-4c39-97d9-61d40dd9aa8a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.208069] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a07ac57b-5842-44df-9958-ed1b2f447b1c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.237844] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71635be4-1026-4454-ad74-58a16bcce913 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.245055] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2f01113-7314-495a-bf85-6ea10fe6174d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.258182] env[65680]: DEBUG nova.compute.provider_tree [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 699.266483] env[65680]: DEBUG nova.scheduler.client.report [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 699.280511] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 
tempest-ServerTagsTestJSON-2119772601-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.399s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.280805] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 699.312793] env[65680]: DEBUG nova.compute.utils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 699.314567] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 699.314750] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 699.323212] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 699.381490] env[65680]: DEBUG nova.policy [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c08cf5bff684dae8e46257b50e0e675', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '312fb656d39849b1accd526dd2f06644', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 699.390905] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 699.409429] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 699.409648] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 699.409648] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 699.410221] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 699.410221] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 699.410221] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 699.410424] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 699.410455] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 699.412565] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 
tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 699.412565] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 699.412565] env[65680]: DEBUG nova.virt.hardware [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 699.412565] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e7daef9-5956-444a-853d-087783767c6d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.423020] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8788990a-c1b6-4c24-99ac-0eaebecdb9e6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.735220] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Successfully created port: 94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 700.496236] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Successfully updated port: 94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 700.507734] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquiring lock "refresh_cache-b163d5b8-b01c-4ace-96e7-56276ab4ba82" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.507734] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquired lock "refresh_cache-b163d5b8-b01c-4ace-96e7-56276ab4ba82" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.507734] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 700.557374] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 700.767282] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Updating instance_info_cache with network_info: [{"id": "94344f23-cda8-41df-a212-f809024b4ac3", "address": "fa:16:3e:6d:db:f6", "network": {"id": "b324acb4-9406-47af-ac27-ade2049809bf", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2068470433-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312fb656d39849b1accd526dd2f06644", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "93c5b7ce-4c84-40bc-884c-b2453e0eee69", "external-id": "nsx-vlan-transportzone-882", "segmentation_id": 882, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94344f23-cd", "ovs_interfaceid": "94344f23-cda8-41df-a212-f809024b4ac3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 700.778388] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Releasing lock "refresh_cache-b163d5b8-b01c-4ace-96e7-56276ab4ba82" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 700.778761] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance network_info: |[{"id": "94344f23-cda8-41df-a212-f809024b4ac3", "address": "fa:16:3e:6d:db:f6", "network": {"id": "b324acb4-9406-47af-ac27-ade2049809bf", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2068470433-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312fb656d39849b1accd526dd2f06644", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "93c5b7ce-4c84-40bc-884c-b2453e0eee69", "external-id": "nsx-vlan-transportzone-882", "segmentation_id": 882, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94344f23-cd", "ovs_interfaceid": "94344f23-cda8-41df-a212-f809024b4ac3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 700.779150] 
env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6d:db:f6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '93c5b7ce-4c84-40bc-884c-b2453e0eee69', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '94344f23-cda8-41df-a212-f809024b4ac3', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 700.786526] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Creating folder: Project (312fb656d39849b1accd526dd2f06644). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 700.788150] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9bfb75d1-250d-4b2d-9a3e-2d7b127da102 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.790779] env[65680]: DEBUG nova.compute.manager [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Received event network-vif-plugged-94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.790978] env[65680]: DEBUG oslo_concurrency.lockutils [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] Acquiring lock "b163d5b8-b01c-4ace-96e7-56276ab4ba82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.791296] env[65680]: DEBUG oslo_concurrency.lockutils [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] Lock "b163d5b8-b01c-4ace-96e7-56276ab4ba82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.791512] env[65680]: DEBUG oslo_concurrency.lockutils [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] Lock "b163d5b8-b01c-4ace-96e7-56276ab4ba82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.791687] env[65680]: DEBUG nova.compute.manager [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] No waiting events found dispatching network-vif-plugged-94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 700.792933] env[65680]: WARNING nova.compute.manager [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Received unexpected event network-vif-plugged-94344f23-cda8-41df-a212-f809024b4ac3 for instance with vm_state building and task_state spawning. 
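For reference, the "Inventory has not changed for provider 93ae29e4-…" entries earlier and later in this section carry the inventory that backs claims like the one above for b163d5b8-b01c-4ace-96e7-56276ab4ba82. Under standard Placement semantics the schedulable amount of each resource class is (total - reserved) * allocation_ratio; below is a small stand-alone check with the values copied from the log (the capacity() helper is ours, not a Nova API).

    # Stand-alone sketch, not Nova code: what the inventory dict logged by
    # nova.scheduler.client.report ("Inventory has not changed for provider
    # 93ae29e4-...") means for schedulable capacity.  min_unit/max_unit/step_size,
    # which only constrain a single allocation (e.g. max_unit=16 VCPU per
    # instance), are left out for brevity.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    def capacity(inv):
        """Placement's effective capacity per resource class:
        (total - reserved) * allocation_ratio."""
        return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
                for rc, v in inv.items()}

    print(capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

Against that 192-VCPU / 196078 MB ceiling, the resource tracker audit later in this section reports only 10 allocated vCPUs and 1792 MB of used RAM, which is why each "Claim successful" here goes through without contention.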
[ 700.793174] env[65680]: DEBUG nova.compute.manager [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Received event network-changed-94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.793347] env[65680]: DEBUG nova.compute.manager [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Refreshing instance network info cache due to event network-changed-94344f23-cda8-41df-a212-f809024b4ac3. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 700.793540] env[65680]: DEBUG oslo_concurrency.lockutils [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] Acquiring lock "refresh_cache-b163d5b8-b01c-4ace-96e7-56276ab4ba82" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.793676] env[65680]: DEBUG oslo_concurrency.lockutils [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] Acquired lock "refresh_cache-b163d5b8-b01c-4ace-96e7-56276ab4ba82" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.793824] env[65680]: DEBUG nova.network.neutron [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Refreshing network info cache for port 94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 700.807047] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Created folder: Project (312fb656d39849b1accd526dd2f06644) in parent group-v572532. [ 700.807151] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Creating folder: Instances. Parent ref: group-v572577. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 700.808303] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5657e21c-d319-4182-8918-9ff7af56a83e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.817763] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Created folder: Instances in parent group-v572577. [ 700.817991] env[65680]: DEBUG oslo.service.loopingcall [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 700.818187] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 700.818427] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-708ebf22-6b17-4141-90f9-5f15eb79583b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.837878] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 700.837878] env[65680]: value = "task-2847892" [ 700.837878] env[65680]: _type = "Task" [ 700.837878] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 700.845152] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847892, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 701.154202] env[65680]: DEBUG nova.network.neutron [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Updated VIF entry in instance network info cache for port 94344f23-cda8-41df-a212-f809024b4ac3. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 701.154655] env[65680]: DEBUG nova.network.neutron [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Updating instance_info_cache with network_info: [{"id": "94344f23-cda8-41df-a212-f809024b4ac3", "address": "fa:16:3e:6d:db:f6", "network": {"id": "b324acb4-9406-47af-ac27-ade2049809bf", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2068470433-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312fb656d39849b1accd526dd2f06644", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "93c5b7ce-4c84-40bc-884c-b2453e0eee69", "external-id": "nsx-vlan-transportzone-882", "segmentation_id": 882, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94344f23-cd", "ovs_interfaceid": "94344f23-cda8-41df-a212-f809024b4ac3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.164135] env[65680]: DEBUG oslo_concurrency.lockutils [req-c05bc8a5-2252-439b-a0c9-358bd2cfcb3c req-5e6f035a-7fe1-4a1d-aea6-e9b434d96058 service nova] Releasing lock "refresh_cache-b163d5b8-b01c-4ace-96e7-56276ab4ba82" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.348524] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847892, 'name': CreateVM_Task, 'duration_secs': 0.313963} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 701.348865] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 701.352019] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.352019] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.352019] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 701.352019] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-86861833-c651-4562-9f85-8ff4201e0b4d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.357534] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Waiting for the task: (returnval){ [ 701.357534] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5210c79e-0c42-5382-8fd5-b0cd495486ca" [ 701.357534] env[65680]: _type = "Task" [ 701.357534] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 701.371508] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5210c79e-0c42-5382-8fd5-b0cd495486ca, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 701.870606] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.870966] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 701.871315] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 742.295098] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 743.289112] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 744.292615] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 744.292944] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 744.292944] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 744.315107] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.315307] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.315445] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.315574] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.315700] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.315824] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.315947] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.316089] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.316216] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.316334] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 744.316452] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 744.316900] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.292767] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.315132] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.315330] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.315697] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.324622] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.324840] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.325016] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.325172] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 745.326186] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b66347-9027-4199-8b02-490c27223f58 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.335196] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b9d8f01-55ad-4a64-861e-4a2378f03da6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.349997] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ef009b-b4ff-470e-ab1a-ed229fb9e170 
{{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.356080] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5c5a08f-45d1-49fd-9e60-474d26814eee {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.384071] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181058MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 745.384211] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.384389] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.448688] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d98c190b-7d45-4e74-909d-75b38bfc6554 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.448920] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449089] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449217] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance fc14c935-fe84-4a49-ac1e-575e56b672a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449337] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance acbe2170-7ce3-4820-b082-6680e559bde1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449466] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance f05204a0-268f-4d77-a2bf-cde4ee02915e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449586] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449703] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 40a7ee3c-8627-47f3-887e-31112586e799 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449818] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 2f6ce1b8-d869-4219-851a-43ae3ddd3816 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.449932] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b163d5b8-b01c-4ace-96e7-56276ab4ba82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 745.460988] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b935e1a7-1c77-4398-a964-cd7da312fc1b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.471564] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance cb739449-a329-41b8-964c-8c9db383e846 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.481502] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 3c728886-a983-4071-a728-25d87770556f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.491231] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance ba875739-2ff0-4778-89cf-5b32f2ffe6fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.500260] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d1f6ea52-4367-4756-88ff-37830ce1aeba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.509205] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 11374639-ed45-4999-b8b9-fdbf08b9d8bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.519467] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance a3876ce4-3e1d-4450-896c-b8321cc1a312 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.529850] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b663e64f-77a5-4938-a492-df6f05bb182e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.538390] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c9022be0-026f-4f6a-b720-162cadcd76bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.548041] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance ce17e81b-0291-4309-8594-28ea20c530a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.558540] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53010485-3888-4669-85b7-01381f0bffcd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.568078] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 017fddb0-49d5-434d-997d-126119a989ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.581083] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d7b468db-115d-4b24-b604-5edb176dbf96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 745.581332] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 745.581478] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 745.843990] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19a66f97-c213-40b6-922d-485d1d3d1463 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.852941] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7f6079b-9ae4-4ff4-8c0c-f5c793b73e53 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.882362] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b73db438-e512-4dda-8b97-0ce55d05da56 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.889728] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1198f573-3067-4a46-95f5-038050f4f30f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.902594] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 745.910448] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 745.925657] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 745.925742] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.541s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.651096] env[65680]: WARNING oslo_vmware.rw_handles [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 746.651096] env[65680]: ERROR oslo_vmware.rw_handles [ 746.651673] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 746.653238] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c 
tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 746.653487] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Copying Virtual Disk [datastore1] vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/a8c82ffe-f3f4-4fa1-b148-2d4abbb0ab3c/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 746.653781] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a54d3629-7dd9-421e-909e-d6ee7ff1be4b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.661432] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Waiting for the task: (returnval){ [ 746.661432] env[65680]: value = "task-2847893" [ 746.661432] env[65680]: _type = "Task" [ 746.661432] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 746.671544] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Task: {'id': task-2847893, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 747.057249] env[65680]: DEBUG nova.compute.manager [req-236661ed-3dab-4308-9528-c8b8f5227868 req-7e18e892-eeea-49d0-8347-978c31f9c24e service nova] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Received event network-vif-deleted-50897877-7974-45f4-be58-52d3c88d26c1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 747.177105] env[65680]: DEBUG oslo_vmware.exceptions [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 747.177105] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 747.177105] env[65680]: ERROR nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 747.177105] env[65680]: Faults: ['InvalidArgument'] [ 747.177105] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Traceback (most recent call last): [ 747.177105] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 747.177105] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] yield resources [ 747.177105] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 747.177105] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] self.driver.spawn(context, instance, image_meta, [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] self._vmops.spawn(context, instance, image_meta, injected_files, [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] self._fetch_image_if_missing(context, vi) [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] image_cache(vi, tmp_image_ds_loc) [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] vm_util.copy_virtual_disk( [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] session._wait_for_task(vmdk_copy_task) [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] return self.wait_for_task(task_ref) [ 747.177661] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] return evt.wait() [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] result = hub.switch() [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] return self.greenlet.switch() [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] self.f(*self.args, **self.kw) [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] raise exceptions.translate_fault(task_info.error) [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Faults: ['InvalidArgument'] [ 747.178065] env[65680]: ERROR nova.compute.manager [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] [ 747.178065] env[65680]: INFO nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Terminating instance [ 747.178709] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 747.178764] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 747.180625] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] 
Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 747.180625] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 747.180625] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53b2f7be-4aee-4cf4-8bea-3ef8b0f0df50 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.182729] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cabaa88-2edc-4eea-8e9f-6014a2e88641 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.189975] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 747.190207] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e3857102-ce8d-4ec5-a2bc-41242189c279 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.192643] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 747.192813] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 747.193766] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2976864-b24c-467a-86a1-5cdadd60fc28 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.200010] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Waiting for the task: (returnval){ [ 747.200010] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52215ced-e628-d5f3-46cc-1c776531cf05" [ 747.200010] env[65680]: _type = "Task" [ 747.200010] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 747.208090] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52215ced-e628-d5f3-46cc-1c776531cf05, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 747.270948] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 747.271240] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 747.271427] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Deleting the datastore file [datastore1] d98c190b-7d45-4e74-909d-75b38bfc6554 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 747.271695] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5b6082e3-1d89-4230-8eb7-78700d1a7472 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.278192] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Waiting for the task: (returnval){ [ 747.278192] env[65680]: value = "task-2847895" [ 747.278192] env[65680]: _type = "Task" [ 747.278192] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 747.290554] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Task: {'id': task-2847895, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 747.710507] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 747.710779] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Creating directory with path [datastore1] vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 747.711096] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6341d7d2-32ea-4547-8d49-9987f3157be2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.722679] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Created directory with path [datastore1] vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 747.722871] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Fetch image to [datastore1] vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 747.723047] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 747.723782] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-551f6471-3912-4a6a-9a79-ca4d63200ae0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.734339] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-041d2b63-1ce4-49c1-9ef5-e0e7f01950e7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.743531] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fb97d9b-38e1-4b06-9e47-dfc363e7fe0c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.775210] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-23a71951-2c8b-4629-8bc0-faab89e89124 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.784980] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-edba8f17-d86b-4b00-92a2-0f75e9270779 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.788963] env[65680]: DEBUG oslo_vmware.api [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Task: {'id': task-2847895, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075292} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 747.789568] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 747.789785] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 747.790024] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 747.790211] env[65680]: INFO nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Took 0.61 seconds to destroy the instance on the hypervisor. 
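The DeleteDatastoreFile_Task records above show the usual oslo.vmware pattern: a vCenter task is started, then polled until it completes or faults. Below is a minimal illustrative sketch of that polling loop, not the actual oslo.vmware implementation; get_task_info is a hypothetical callable standing in for the PropertyCollector read of the Task object.

import time

def wait_for_task(get_task_info, poll_interval=0.5):
    # Poll the task until vCenter reports success or error; each
    # "Task: {...} progress is N%" debug line above corresponds to
    # one pass through this loop.
    while True:
        info = get_task_info()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # A faulted task is translated into an exception, as seen
            # elsewhere in this log with VimFaultException/InvalidArgument.
            raise RuntimeError(info.get('error'))
        time.sleep(poll_interval)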
[ 747.792375] env[65680]: DEBUG nova.compute.claims [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 747.792765] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 747.792765] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 747.809216] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 747.827209] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.034s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 747.831021] env[65680]: DEBUG nova.compute.utils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance d98c190b-7d45-4e74-909d-75b38bfc6554 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 747.831287] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 747.831672] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 747.831897] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 747.832191] env[65680]: DEBUG nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 747.832394] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 747.885448] env[65680]: DEBUG oslo_vmware.rw_handles [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 747.940942] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 747.945294] env[65680]: DEBUG oslo_vmware.rw_handles [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 747.945615] env[65680]: DEBUG oslo_vmware.rw_handles [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 747.989853] env[65680]: DEBUG nova.network.neutron [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 747.999651] env[65680]: INFO nova.compute.manager [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Took 0.17 seconds to deallocate network for instance. 
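The paired 'Acquiring lock "compute_resources" ...' / 'Lock "compute_resources" "released" ...' lines are emitted by oslo.concurrency around the resource-tracker critical sections (instance_claim, abort_instance_claim, _update_available_resource). A minimal sketch of that pattern follows, with an illustrative function body rather than Nova's real resource-tracker code.

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def abort_instance_claim(instance):
    # Runs with the process-local "compute_resources" lock held;
    # lockutils itself logs the acquire/release lines, including the
    # waited/held durations seen above.
    ...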
[ 748.057144] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c8d681d1-9b3f-4ce6-ad0a-ca1b88e9f85c tempest-InstanceActionsV221TestJSON-146167596 tempest-InstanceActionsV221TestJSON-146167596-project-member] Lock "d98c190b-7d45-4e74-909d-75b38bfc6554" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.442s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.058584] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "d98c190b-7d45-4e74-909d-75b38bfc6554" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 189.582s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.058781] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] During sync_power_state the instance has a pending task (spawning). Skip. [ 748.059064] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "d98c190b-7d45-4e74-909d-75b38bfc6554" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.071023] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 748.129766] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 748.130029] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.131509] env[65680]: INFO nova.compute.claims [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 748.294216] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 748.294390] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 748.469175] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e1e6a0d-b64e-4fa8-88ac-70593855438f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.476787] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d69ed42-677b-4cf2-856a-3888d0c00fdc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.506089] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78962177-f191-4d81-9858-5188c7336cd7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.513464] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc70a46c-3083-4d45-8f73-49c0066da123 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.526383] env[65680]: DEBUG nova.compute.provider_tree [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 748.536325] env[65680]: DEBUG nova.scheduler.client.report [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 748.550062] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.420s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.550474] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Start building networks asynchronously for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 748.609806] env[65680]: DEBUG nova.compute.utils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 748.612080] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 748.612427] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 748.627025] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 748.673362] env[65680]: DEBUG nova.policy [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '68d5fafd6fd7456aa8318e52d8858249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30fcdee2e31a4e94971b12026ac1f3e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 748.701019] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 748.728090] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 748.728090] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 748.728090] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 748.728405] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 748.728405] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 748.728405] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 748.728405] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 748.728405] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 748.728551] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 748.728551] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 748.728551] env[65680]: DEBUG nova.virt.hardware [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 748.733276] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6690479b-ca56-49dd-9afa-29b3591e25a0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.739800] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54864c3e-8d5f-4f11-9633-ec2e2ad99e74 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.148269] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Successfully created port: 74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 750.095161] env[65680]: DEBUG nova.compute.manager [req-0687f47b-d1f5-442c-b4a6-4fad0a023179 req-e4b0a711-71b2-417f-9a89-e69a5b8e0339 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Received event network-vif-plugged-74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 750.095161] env[65680]: DEBUG oslo_concurrency.lockutils [req-0687f47b-d1f5-442c-b4a6-4fad0a023179 req-e4b0a711-71b2-417f-9a89-e69a5b8e0339 service nova] Acquiring lock "b935e1a7-1c77-4398-a964-cd7da312fc1b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.095161] env[65680]: DEBUG oslo_concurrency.lockutils [req-0687f47b-d1f5-442c-b4a6-4fad0a023179 req-e4b0a711-71b2-417f-9a89-e69a5b8e0339 service nova] Lock "b935e1a7-1c77-4398-a964-cd7da312fc1b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.095161] env[65680]: DEBUG oslo_concurrency.lockutils [req-0687f47b-d1f5-442c-b4a6-4fad0a023179 req-e4b0a711-71b2-417f-9a89-e69a5b8e0339 service nova] Lock "b935e1a7-1c77-4398-a964-cd7da312fc1b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s 
{{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.095489] env[65680]: DEBUG nova.compute.manager [req-0687f47b-d1f5-442c-b4a6-4fad0a023179 req-e4b0a711-71b2-417f-9a89-e69a5b8e0339 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] No waiting events found dispatching network-vif-plugged-74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 750.096262] env[65680]: WARNING nova.compute.manager [req-0687f47b-d1f5-442c-b4a6-4fad0a023179 req-e4b0a711-71b2-417f-9a89-e69a5b8e0339 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Received unexpected event network-vif-plugged-74d49725-9616-4fce-9264-7c6d80f19f05 for instance with vm_state building and task_state spawning. [ 750.135941] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Successfully updated port: 74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 750.145697] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquiring lock "refresh_cache-b935e1a7-1c77-4398-a964-cd7da312fc1b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.145697] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquired lock "refresh_cache-b935e1a7-1c77-4398-a964-cd7da312fc1b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.145697] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 750.189707] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.388204] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Updating instance_info_cache with network_info: [{"id": "74d49725-9616-4fce-9264-7c6d80f19f05", "address": "fa:16:3e:fa:af:24", "network": {"id": "97274db0-8eed-45dc-a1c3-2f1695916478", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-2121286750-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30fcdee2e31a4e94971b12026ac1f3e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "678ebbe4-4c53-4eaf-a689-93981310f37d", "external-id": "nsx-vlan-transportzone-443", "segmentation_id": 443, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74d49725-96", "ovs_interfaceid": "74d49725-9616-4fce-9264-7c6d80f19f05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.402601] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Releasing lock "refresh_cache-b935e1a7-1c77-4398-a964-cd7da312fc1b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 750.402908] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance network_info: |[{"id": "74d49725-9616-4fce-9264-7c6d80f19f05", "address": "fa:16:3e:fa:af:24", "network": {"id": "97274db0-8eed-45dc-a1c3-2f1695916478", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-2121286750-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30fcdee2e31a4e94971b12026ac1f3e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "678ebbe4-4c53-4eaf-a689-93981310f37d", "external-id": "nsx-vlan-transportzone-443", "segmentation_id": 443, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74d49725-96", "ovs_interfaceid": "74d49725-9616-4fce-9264-7c6d80f19f05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 750.403294] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fa:af:24', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '678ebbe4-4c53-4eaf-a689-93981310f37d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '74d49725-9616-4fce-9264-7c6d80f19f05', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 750.415181] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Creating folder: Project (30fcdee2e31a4e94971b12026ac1f3e4). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 750.415732] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-591c0f41-8796-4ff8-a775-12bb7e46ca9f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.427796] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Created folder: Project (30fcdee2e31a4e94971b12026ac1f3e4) in parent group-v572532. [ 750.427971] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Creating folder: Instances. Parent ref: group-v572580. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 750.428213] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b855a27f-47c7-45d5-8aac-f0630180acf1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.437356] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Created folder: Instances in parent group-v572580. [ 750.437583] env[65680]: DEBUG oslo.service.loopingcall [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 750.437768] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 750.438084] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bea2eab2-e83f-44d3-bb54-a1b0ea7357e5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.457864] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 750.457864] env[65680]: value = "task-2847898" [ 750.457864] env[65680]: _type = "Task" [ 750.457864] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 750.465672] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847898, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 750.968344] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847898, 'name': CreateVM_Task, 'duration_secs': 0.292978} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 750.968512] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 750.969177] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.969339] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.969643] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 750.969883] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f6d5508-a3b7-413f-a787-fcbd7e9eecec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.974285] env[65680]: DEBUG oslo_vmware.api [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Waiting for the task: (returnval){ [ 750.974285] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]521c7082-c524-f4d8-3808-db92f8f4b12b" [ 750.974285] env[65680]: _type = "Task" [ 750.974285] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 750.982614] env[65680]: DEBUG oslo_vmware.api [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]521c7082-c524-f4d8-3808-db92f8f4b12b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 751.485065] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.485350] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 751.485539] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.118010] env[65680]: DEBUG nova.compute.manager [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Received event network-changed-74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 752.118222] env[65680]: DEBUG nova.compute.manager [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Refreshing instance network info cache due to event network-changed-74d49725-9616-4fce-9264-7c6d80f19f05. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 752.118434] env[65680]: DEBUG oslo_concurrency.lockutils [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] Acquiring lock "refresh_cache-b935e1a7-1c77-4398-a964-cd7da312fc1b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.118579] env[65680]: DEBUG oslo_concurrency.lockutils [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] Acquired lock "refresh_cache-b935e1a7-1c77-4398-a964-cd7da312fc1b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 752.118723] env[65680]: DEBUG nova.network.neutron [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Refreshing network info cache for port 74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 752.385826] env[65680]: DEBUG nova.network.neutron [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Updated VIF entry in instance network info cache for port 74d49725-9616-4fce-9264-7c6d80f19f05. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 752.386189] env[65680]: DEBUG nova.network.neutron [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Updating instance_info_cache with network_info: [{"id": "74d49725-9616-4fce-9264-7c6d80f19f05", "address": "fa:16:3e:fa:af:24", "network": {"id": "97274db0-8eed-45dc-a1c3-2f1695916478", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-2121286750-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30fcdee2e31a4e94971b12026ac1f3e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "678ebbe4-4c53-4eaf-a689-93981310f37d", "external-id": "nsx-vlan-transportzone-443", "segmentation_id": 443, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74d49725-96", "ovs_interfaceid": "74d49725-9616-4fce-9264-7c6d80f19f05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.398578] env[65680]: DEBUG oslo_concurrency.lockutils [req-eab0f5a5-a42c-4b28-b734-89a057faf6d6 req-fd09166c-1c37-4dfa-950a-a026b3ebf1c9 service nova] Releasing lock "refresh_cache-b935e1a7-1c77-4398-a964-cd7da312fc1b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.229903] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock 
"53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.080138] env[65680]: DEBUG oslo_concurrency.lockutils [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.554318] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "fc14c935-fe84-4a49-ac1e-575e56b672a3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 765.828443] env[65680]: DEBUG oslo_concurrency.lockutils [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "acbe2170-7ce3-4820-b082-6680e559bde1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 796.670100] env[65680]: WARNING oslo_vmware.rw_handles [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 796.670100] env[65680]: ERROR oslo_vmware.rw_handles [ 796.670795] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 
{{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 796.672981] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 796.672981] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Copying Virtual Disk [datastore1] vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/0a273832-7eed-4f9f-a942-09bb4938ebad/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 796.673329] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6db4f84c-d9fc-430e-818e-c36269909f04 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.682937] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Waiting for the task: (returnval){ [ 796.682937] env[65680]: value = "task-2847899" [ 796.682937] env[65680]: _type = "Task" [ 796.682937] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 796.690925] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Task: {'id': task-2847899, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 797.193775] env[65680]: DEBUG oslo_vmware.exceptions [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 797.194054] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 797.194610] env[65680]: ERROR nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.194610] env[65680]: Faults: ['InvalidArgument'] [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Traceback (most recent call last): [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] yield resources [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self.driver.spawn(context, instance, image_meta, [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self._vmops.spawn(context, instance, image_meta, injected_files, [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self._fetch_image_if_missing(context, vi) [ 797.194610] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] image_cache(vi, tmp_image_ds_loc) [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] vm_util.copy_virtual_disk( [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] session._wait_for_task(vmdk_copy_task) [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] return self.wait_for_task(task_ref) [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] return evt.wait() [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] result = hub.switch() [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 797.194997] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] return self.greenlet.switch() [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self.f(*self.args, **self.kw) [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] raise exceptions.translate_fault(task_info.error) [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Faults: ['InvalidArgument'] [ 797.195409] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] [ 797.195409] env[65680]: INFO nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Terminating instance [ 797.196439] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 797.196663] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 797.196890] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f21434d-3477-4eb9-b781-e009eb862577 
{{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.199194] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 797.199433] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 797.200141] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-525e48e5-c6e9-46ee-b3b6-39676fe9ce6e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.206379] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 797.206594] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f1dfdb1f-4997-4aa5-823e-dc6f621fae27 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.208681] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 797.208850] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 797.209756] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1469beaa-1d88-4c87-b38d-8a991986f151 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.214377] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Waiting for the task: (returnval){ [ 797.214377] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52169030-fc00-e239-1d73-9adf8a5e65c5" [ 797.214377] env[65680]: _type = "Task" [ 797.214377] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 797.220938] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52169030-fc00-e239-1d73-9adf8a5e65c5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 797.279686] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 797.279906] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 797.280095] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Deleting the datastore file [datastore1] 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 797.280353] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-32e0aa7d-41ca-4cd4-9349-6d84bfeaf814 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.287178] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Waiting for the task: (returnval){ [ 797.287178] env[65680]: value = "task-2847901" [ 797.287178] env[65680]: _type = "Task" [ 797.287178] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 797.294816] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Task: {'id': task-2847901, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 797.724853] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 797.725172] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Creating directory with path [datastore1] vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 797.726033] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-72bb0028-1d73-447a-b687-32e7b1bc5824 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.737337] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Created directory with path [datastore1] vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 797.737533] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Fetch image to [datastore1] vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 797.737704] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 797.738445] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb0c7467-db5b-40d1-9a34-a5f32f39a3d7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.745157] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9391a052-002b-42ce-97a3-099ec1497fed {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.754115] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a829e2-6304-4ba8-b042-bd239011f756 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.787278] env[65680]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0060ced-6c65-47c2-baa3-4c9c20c272b7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.798995] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-177cacb8-bd69-4eab-a6c8-c9cdee55b5b4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.801291] env[65680]: DEBUG oslo_vmware.api [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Task: {'id': task-2847901, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062838} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 797.801552] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 797.801732] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 797.801897] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 797.802080] env[65680]: INFO nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Took 0.60 seconds to destroy the instance on the hypervisor. 
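[editor's note] The task lifecycle traced above ("Waiting for the task ...", "progress is 0%", "completed successfully ... 'duration_secs': ...") follows a poll-until-terminal pattern: an asynchronous vCenter task is submitted, its state is polled on a fixed interval, and the caller either gets the result back or has the task's fault raised at the wait site, which is how the earlier InvalidArgument fault surfaced as a VimFaultException inside _poll_task. The following is a minimal, self-contained Python sketch of that pattern only; the names and shapes are illustrative and are not the oslo.vmware implementation.

import time

class TaskFailed(Exception):
    """Raised when the remote task ends in an error state."""

def wait_for_task(get_task_info, interval=0.5, timeout=300):
    # Poll until the task reaches a terminal state, mirroring the
    # "progress is N%" / "completed successfully" lines in the log.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info()          # e.g. one property-collector round trip
        state = info.get("state")
        if state == "success":
            return info.get("result")
        if state == "error":
            # The real driver translates the fault before re-raising;
            # here the error text is simply wrapped.
            raise TaskFailed(info.get("error", "unknown fault"))
        print(f"task {info.get('id')} progress is {info.get('progress', 0)}%")
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")

# Toy usage: a task that reports one in-progress poll, then succeeds.
states = iter([{"id": "task-1", "state": "running", "progress": 0},
               {"id": "task-1", "state": "success", "result": "ok"}])
print(wait_for_task(lambda: next(states), interval=0))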
[ 797.804172] env[65680]: DEBUG nova.compute.claims [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 797.804346] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 797.804568] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 797.893235] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 797.939530] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 797.997870] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 797.997870] env[65680]: DEBUG oslo_vmware.rw_handles [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 798.146907] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e26fb3-66f7-40fb-b7c3-2bb6f2f26a54 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.154720] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4f22a21-dd30-4910-ab69-a2622675dffc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.184540] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96a1f73c-6e0a-4c00-83f5-d92a4ecb4628 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.191265] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6bb8e5f-f07a-4926-adfd-694dc8a77855 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.203947] env[65680]: DEBUG nova.compute.provider_tree [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.212183] env[65680]: DEBUG nova.scheduler.client.report [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.228848] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.424s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.229385] env[65680]: ERROR nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.229385] env[65680]: Faults: ['InvalidArgument'] [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Traceback (most recent call last): [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 798.229385] env[65680]: ERROR 
nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self.driver.spawn(context, instance, image_meta, [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self._vmops.spawn(context, instance, image_meta, injected_files, [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self._fetch_image_if_missing(context, vi) [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] image_cache(vi, tmp_image_ds_loc) [ 798.229385] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] vm_util.copy_virtual_disk( [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] session._wait_for_task(vmdk_copy_task) [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] return self.wait_for_task(task_ref) [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] return evt.wait() [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] result = hub.switch() [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] return self.greenlet.switch() [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 798.229773] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] self.f(*self.args, **self.kw) [ 798.230164] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 798.230164] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] raise exceptions.translate_fault(task_info.error) [ 798.230164] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.230164] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Faults: ['InvalidArgument'] [ 798.230164] env[65680]: ERROR nova.compute.manager [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] [ 798.230164] env[65680]: DEBUG nova.compute.utils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 798.231474] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Build of instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 was re-scheduled: A specified parameter was not correct: fileType [ 798.231474] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 798.231858] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 798.232037] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 798.232211] env[65680]: DEBUG nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 798.232377] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 798.523173] env[65680]: DEBUG nova.network.neutron [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.534660] env[65680]: INFO nova.compute.manager [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Took 0.30 seconds to deallocate network for instance. [ 798.633774] env[65680]: INFO nova.scheduler.client.report [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Deleted allocations for instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 [ 798.651339] env[65680]: DEBUG oslo_concurrency.lockutils [None req-75114fa9-a181-4f60-8831-9fe1cf9e363f tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.349s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.652504] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 240.175s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.652715] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] During sync_power_state the instance has a pending task (spawning). Skip. 
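[editor's note] The lockutils messages that bracket nearly every step here ('Acquiring lock "X" by "Y"', 'acquired ... waited N s', '"released" ... held N s') come from a named-lock helper that records how long a caller waited for the lock and how long it held it; the 242.349s hold above is the whole failed build-and-run attempt for that instance UUID. Below is a rough, stdlib-only sketch of that bookkeeping pattern; it is illustrative only, the real helper lives in oslo_concurrency.lockutils and also supports inter-process locks.

import threading
import time
from collections import defaultdict
from contextlib import contextmanager

_locks = defaultdict(threading.Lock)   # one process-local lock per name

@contextmanager
def named_lock(name, caller):
    lock = _locks[name]
    print(f'Acquiring lock "{name}" by "{caller}"')
    t0 = time.monotonic()
    with lock:
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - t1
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

# Example: serialize a terminate-style critical section per instance UUID.
with named_lock("53d9fa9e-f645-47a7-9990-4ad3bfb4ca45", "do_terminate_instance"):
    pass  # destroy the instance, deallocate its network, clear its events, ...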
[ 798.652890] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.653484] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 43.424s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.653718] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Acquiring lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.653908] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.654819] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.656587] env[65680]: INFO nova.compute.manager [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Terminating instance [ 798.660782] env[65680]: DEBUG nova.compute.manager [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 798.660782] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 798.660782] env[65680]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9d70442d-23dd-4a03-9e3b-17682f9a948d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.675532] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c91a8b52-ca08-4623-95f4-6d55529a3a7e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.682557] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 798.703607] env[65680]: WARNING nova.virt.vmwareapi.vmops [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45 could not be found. [ 798.703848] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 798.704044] env[65680]: INFO nova.compute.manager [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Took 0.04 seconds to destroy the instance on the hypervisor. [ 798.704325] env[65680]: DEBUG oslo.service.loopingcall [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 798.704517] env[65680]: DEBUG nova.compute.manager [-] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 798.704616] env[65680]: DEBUG nova.network.neutron [-] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 798.728410] env[65680]: DEBUG nova.network.neutron [-] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.733995] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.734272] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.735779] env[65680]: INFO nova.compute.claims [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 798.742036] env[65680]: INFO nova.compute.manager [-] [instance: 53d9fa9e-f645-47a7-9990-4ad3bfb4ca45] Took 0.03 seconds to deallocate network for instance. 
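[editor's note] The "Claim successful on node ..." / "Aborting claim" pair, together with the "compute_resources" lock, reflects a reserve-then-commit-or-roll-back pattern: resources are claimed under the tracker lock before spawn, and handed back if the build later fails, which is what happened above when the spawn raised VimFaultException. The sketch below shows only the shape of that idea with invented names; it is not Nova's resource tracker.

import threading

class ResourceTracker:
    """Toy tracker: reserves vCPUs under a lock and rolls back on abort."""

    def __init__(self, total_vcpus):
        self._lock = threading.Lock()        # plays the "compute_resources" role
        self.free_vcpus = total_vcpus

    def instance_claim(self, instance_uuid, vcpus):
        with self._lock:
            if vcpus > self.free_vcpus:
                raise RuntimeError("insufficient vCPUs")
            self.free_vcpus -= vcpus
        print(f"Claim successful for {instance_uuid}")
        return Claim(self, instance_uuid, vcpus)

class Claim:
    def __init__(self, tracker, instance_uuid, vcpus):
        self.tracker, self.instance_uuid, self.vcpus = tracker, instance_uuid, vcpus

    def abort(self):
        # Called when the build fails after resources were already reserved.
        with self.tracker._lock:
            self.tracker.free_vcpus += self.vcpus
        print(f"Aborting claim for {self.instance_uuid}")

tracker = ResourceTracker(total_vcpus=48)
claim = tracker.instance_claim("cb739449-a329-41b8-964c-8c9db383e846", vcpus=1)
try:
    raise RuntimeError("spawn failed")       # stand-in for a VimFaultException
except RuntimeError:
    claim.abort()                            # give the reservation back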
[ 798.856684] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ec61727a-53e7-49af-81ce-61a24abc3ede tempest-FloatingIPsAssociationTestJSON-1307181314 tempest-FloatingIPsAssociationTestJSON-1307181314-project-member] Lock "53d9fa9e-f645-47a7-9990-4ad3bfb4ca45" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.203s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.074682] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92efefd1-4ec1-4823-83db-088bfe551d3e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.082414] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bfe68d6-5028-48c7-8bca-75bbb3520640 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.111033] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-418d22ac-3bad-42d5-8b50-29c1efc4266b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.118056] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-703dcd77-adab-4116-bc2b-59e72e47e790 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.131067] env[65680]: DEBUG nova.compute.provider_tree [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 799.140166] env[65680]: DEBUG nova.scheduler.client.report [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 799.153051] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.419s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.153458] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Start building networks asynchronously for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 799.185048] env[65680]: DEBUG nova.compute.utils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 799.186554] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 799.186675] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 799.194107] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 799.238366] env[65680]: DEBUG nova.policy [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '01dddda9856a48da8743a705976a7f32', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c73ce2fa3594e49b3a84c5645e34107', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 799.253887] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 799.274468] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 799.274706] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 799.274880] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 799.275109] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 799.275265] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 799.275408] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 799.275615] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 799.275777] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 799.275931] env[65680]: DEBUG 
nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 799.276102] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 799.276272] env[65680]: DEBUG nova.virt.hardware [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 799.277118] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45e63450-5ca5-489e-8cd9-b9d5443d566b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.284886] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d44f954d-6902-4ddc-9dcb-39e9bf0bdb1a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.534543] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Successfully created port: a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 800.655467] env[65680]: DEBUG nova.compute.manager [req-d2d75ad3-092a-4008-862e-e1c538dbb354 req-ba3fd983-0698-4a6e-af6e-1fc623f6b1f6 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Received event network-vif-plugged-a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 800.655742] env[65680]: DEBUG oslo_concurrency.lockutils [req-d2d75ad3-092a-4008-862e-e1c538dbb354 req-ba3fd983-0698-4a6e-af6e-1fc623f6b1f6 service nova] Acquiring lock "cb739449-a329-41b8-964c-8c9db383e846-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.656100] env[65680]: DEBUG oslo_concurrency.lockutils [req-d2d75ad3-092a-4008-862e-e1c538dbb354 req-ba3fd983-0698-4a6e-af6e-1fc623f6b1f6 service nova] Lock "cb739449-a329-41b8-964c-8c9db383e846-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.656214] env[65680]: DEBUG oslo_concurrency.lockutils [req-d2d75ad3-092a-4008-862e-e1c538dbb354 req-ba3fd983-0698-4a6e-af6e-1fc623f6b1f6 service nova] Lock "cb739449-a329-41b8-964c-8c9db383e846-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 800.656377] env[65680]: DEBUG nova.compute.manager 
[req-d2d75ad3-092a-4008-862e-e1c538dbb354 req-ba3fd983-0698-4a6e-af6e-1fc623f6b1f6 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] No waiting events found dispatching network-vif-plugged-a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 800.656540] env[65680]: WARNING nova.compute.manager [req-d2d75ad3-092a-4008-862e-e1c538dbb354 req-ba3fd983-0698-4a6e-af6e-1fc623f6b1f6 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Received unexpected event network-vif-plugged-a73735b5-32d4-4c27-9106-b3baaf3b18d4 for instance with vm_state building and task_state spawning. [ 800.777531] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Successfully updated port: a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 800.791305] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquiring lock "refresh_cache-cb739449-a329-41b8-964c-8c9db383e846" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 800.791486] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquired lock "refresh_cache-cb739449-a329-41b8-964c-8c9db383e846" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 800.791657] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 800.877988] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 801.420840] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Updating instance_info_cache with network_info: [{"id": "a73735b5-32d4-4c27-9106-b3baaf3b18d4", "address": "fa:16:3e:9f:b7:28", "network": {"id": "806968c4-e38e-4680-9bec-95b0c36bc48d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-263352403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2c73ce2fa3594e49b3a84c5645e34107", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11da2092-76f7-447e-babb-8fc14ad39a71", "external-id": "nsx-vlan-transportzone-585", "segmentation_id": 585, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa73735b5-32", "ovs_interfaceid": "a73735b5-32d4-4c27-9106-b3baaf3b18d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.434992] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Releasing lock "refresh_cache-cb739449-a329-41b8-964c-8c9db383e846" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 801.435311] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance network_info: |[{"id": "a73735b5-32d4-4c27-9106-b3baaf3b18d4", "address": "fa:16:3e:9f:b7:28", "network": {"id": "806968c4-e38e-4680-9bec-95b0c36bc48d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-263352403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2c73ce2fa3594e49b3a84c5645e34107", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11da2092-76f7-447e-babb-8fc14ad39a71", "external-id": "nsx-vlan-transportzone-585", "segmentation_id": 585, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa73735b5-32", "ovs_interfaceid": "a73735b5-32d4-4c27-9106-b3baaf3b18d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 801.435681] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9f:b7:28', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '11da2092-76f7-447e-babb-8fc14ad39a71', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a73735b5-32d4-4c27-9106-b3baaf3b18d4', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 801.443131] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Creating folder: Project (2c73ce2fa3594e49b3a84c5645e34107). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 801.443638] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4ee83233-ddcb-4def-92aa-db5de2406f4c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.455257] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Created folder: Project (2c73ce2fa3594e49b3a84c5645e34107) in parent group-v572532. [ 801.455481] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Creating folder: Instances. Parent ref: group-v572583. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 801.455748] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5b9df296-2f36-4fa6-8251-7a94f5f526be {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.464480] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Created folder: Instances in parent group-v572583. [ 801.465687] env[65680]: DEBUG oslo.service.loopingcall [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 801.465888] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cb739449-a329-41b8-964c-8c9db383e846] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 801.466100] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3cbe5ff8-4b01-4f42-83e6-fc4079465965 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.490073] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 801.490073] env[65680]: value = "task-2847904" [ 801.490073] env[65680]: _type = "Task" [ 801.490073] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 801.496768] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847904, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 801.999906] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847904, 'name': CreateVM_Task, 'duration_secs': 0.319299} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 801.999906] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cb739449-a329-41b8-964c-8c9db383e846] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 802.000177] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 802.000294] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 802.000621] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 802.000871] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4b8596d0-585a-4d14-8d33-fdf5ee394b83 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.005841] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Waiting for the task: (returnval){ [ 802.005841] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52a7e174-d371-13ec-66ca-2e00677dd8ee" [ 802.005841] env[65680]: _type = "Task" [ 802.005841] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.013078] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52a7e174-d371-13ec-66ca-2e00677dd8ee, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.293165] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.293226] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.293344] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Cleaning up deleted instances {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 802.308352] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] There are 1 instances to clean {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 802.308753] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: d98c190b-7d45-4e74-909d-75b38bfc6554] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 802.346908] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.346908] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Cleaning up deleted instances with incomplete migration {{(pid=65680) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 802.356170] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.514994] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 802.515296] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 802.515509] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 802.795724] env[65680]: DEBUG nova.compute.manager 
[req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Received event network-changed-a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 802.795849] env[65680]: DEBUG nova.compute.manager [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Refreshing instance network info cache due to event network-changed-a73735b5-32d4-4c27-9106-b3baaf3b18d4. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 802.798183] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] Acquiring lock "refresh_cache-cb739449-a329-41b8-964c-8c9db383e846" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 802.798183] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] Acquired lock "refresh_cache-cb739449-a329-41b8-964c-8c9db383e846" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 802.798183] env[65680]: DEBUG nova.network.neutron [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Refreshing network info cache for port a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 803.085897] env[65680]: DEBUG nova.network.neutron [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Updated VIF entry in instance network info cache for port a73735b5-32d4-4c27-9106-b3baaf3b18d4. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 803.086204] env[65680]: DEBUG nova.network.neutron [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Updating instance_info_cache with network_info: [{"id": "a73735b5-32d4-4c27-9106-b3baaf3b18d4", "address": "fa:16:3e:9f:b7:28", "network": {"id": "806968c4-e38e-4680-9bec-95b0c36bc48d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-263352403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2c73ce2fa3594e49b3a84c5645e34107", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "11da2092-76f7-447e-babb-8fc14ad39a71", "external-id": "nsx-vlan-transportzone-585", "segmentation_id": 585, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa73735b5-32", "ovs_interfaceid": "a73735b5-32d4-4c27-9106-b3baaf3b18d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.099485] env[65680]: DEBUG oslo_concurrency.lockutils [req-b2327277-1732-4353-a305-e8015aa2ddf0 req-1b7df5d5-a9b9-453a-8566-2e906cd46f50 service nova] Releasing lock "refresh_cache-cb739449-a329-41b8-964c-8c9db383e846" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 803.359828] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.293174] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.293473] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 804.293516] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 804.325406] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.325569] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.325702] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.325830] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.325956] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.326104] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.326230] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.326350] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.326468] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.326587] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: cb739449-a329-41b8-964c-8c9db383e846] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.326709] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 804.327569] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 806.293387] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.292765] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.292929] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.305115] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.305371] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.305500] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.305658] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 807.306735] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a66da32d-7ea7-419e-874c-d3fd92dd063b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.324624] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af02ffde-f18c-4962-b1cd-3d680ac94ca1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.333742] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquiring lock "e5d6d263-463e-46b8-9bb3-d10a4101d4e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.333742] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Lock "e5d6d263-463e-46b8-9bb3-d10a4101d4e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.343270] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12908599-658b-40c0-b526-e486b0f7438f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.355565] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a09c6db8-005d-4f69-a6b8-65c36c0c49ee {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.386634] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181057MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 807.388385] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.388385] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.510226] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.510226] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance fc14c935-fe84-4a49-ac1e-575e56b672a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.510226] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance acbe2170-7ce3-4820-b082-6680e559bde1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.510412] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance f05204a0-268f-4d77-a2bf-cde4ee02915e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.510450] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.511041] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 40a7ee3c-8627-47f3-887e-31112586e799 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.511041] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 2f6ce1b8-d869-4219-851a-43ae3ddd3816 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.511041] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b163d5b8-b01c-4ace-96e7-56276ab4ba82 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.511041] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b935e1a7-1c77-4398-a964-cd7da312fc1b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.511255] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance cb739449-a329-41b8-964c-8c9db383e846 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 807.523131] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 3c728886-a983-4071-a728-25d87770556f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.533329] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance ba875739-2ff0-4778-89cf-5b32f2ffe6fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.543520] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d1f6ea52-4367-4756-88ff-37830ce1aeba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.553111] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 11374639-ed45-4999-b8b9-fdbf08b9d8bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.561999] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance a3876ce4-3e1d-4450-896c-b8321cc1a312 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.575323] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance b663e64f-77a5-4938-a492-df6f05bb182e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.584588] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c9022be0-026f-4f6a-b720-162cadcd76bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.594551] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance ce17e81b-0291-4309-8594-28ea20c530a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.606368] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 53010485-3888-4669-85b7-01381f0bffcd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.618942] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 017fddb0-49d5-434d-997d-126119a989ef has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.631517] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance d7b468db-115d-4b24-b604-5edb176dbf96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.642968] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance e5d6d263-463e-46b8-9bb3-d10a4101d4e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 807.643230] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 807.643382] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 807.659441] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Refreshing inventories for resource provider 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 807.674119] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Updating ProviderTree inventory for provider 93ae29e4-bd04-4c19-80be-8057217cf400 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 807.674343] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Updating inventory in ProviderTree for provider 93ae29e4-bd04-4c19-80be-8057217cf400 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 807.685444] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Refreshing aggregate associations for resource provider 93ae29e4-bd04-4c19-80be-8057217cf400, aggregates: None {{(pid=65680) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 807.701723] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Refreshing trait associations for resource provider 93ae29e4-bd04-4c19-80be-8057217cf400, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=65680) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 807.934301] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a4acaa-e1d2-4d7c-b4f6-b6093bf23aed {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.941830] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cc56d3c8-5914-4f27-b81b-a2cb96b257fd {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.971704] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b4354fb-4cf2-4135-ad18-ee15e3d93927 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.978493] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72ff7e87-f75f-4ebc-825e-90169e463360 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.991506] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 808.000873] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 808.015680] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 808.015872] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.629s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 809.016136] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 809.016471] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 809.293568] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 831.246976] env[65680]: DEBUG nova.compute.manager [req-af81279d-6cfc-44a3-ba76-d3fe5ce1afcd req-4699e271-d2ca-48dc-9562-a7f9749f2ec1 service nova] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Received event network-vif-deleted-ded655a7-5ab4-4feb-ab0c-65a6d60d802e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 832.029403] env[65680]: DEBUG nova.compute.manager [req-afe0a774-434e-4d3d-9243-98b611dc1e79 req-80b2b8dd-0f43-4996-a84a-37ae8a92af34 service nova] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Received event network-vif-deleted-6fd47fa7-f60f-4555-b8ee-8bd5b78a3825 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 833.800346] env[65680]: DEBUG nova.compute.manager [req-09422063-980b-4e8f-b4fc-f41f6390bf95 req-d5788970-fe11-4234-85d2-55995abae301 service nova] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Received event network-vif-deleted-2e0079a8-bad7-4a86-a3b1-1aed3ef0a7c7 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 834.762939] env[65680]: DEBUG nova.compute.manager [req-635b6ba0-af3e-46d2-a9d3-bd282e3cbec4 req-90bb1e08-3352-498e-bc5b-228e8ac42463 service nova] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Received event network-vif-deleted-94344f23-cda8-41df-a212-f809024b4ac3 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 836.726843] env[65680]: DEBUG nova.compute.manager [req-66288383-9ddd-4a2a-a099-2a82c6677a88 req-881ec05c-5942-4931-83ad-74749d605b52 service nova] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Received event network-vif-deleted-704ca6e3-8d5a-411b-9796-eb0201636d9c {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 836.726843] env[65680]: DEBUG nova.compute.manager [req-66288383-9ddd-4a2a-a099-2a82c6677a88 req-881ec05c-5942-4931-83ad-74749d605b52 service nova] [instance: cb739449-a329-41b8-964c-8c9db383e846] Received event network-vif-deleted-a73735b5-32d4-4c27-9106-b3baaf3b18d4 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 837.225575] env[65680]: DEBUG nova.compute.manager [req-93110e64-126d-46fa-9308-662129d4560d req-2b985ea1-fe46-4003-b5a8-0dfb0d44578a service nova] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Received event network-vif-deleted-74d49725-9616-4fce-9264-7c6d80f19f05 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 844.867167] env[65680]: WARNING oslo_vmware.rw_handles [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 
844.867167] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 844.867167] env[65680]: ERROR oslo_vmware.rw_handles [ 844.867809] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 844.873513] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 844.873513] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Copying Virtual Disk [datastore1] vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/c05b3b82-ff47-4ca9-8b69-3de5cfeea83c/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 844.873840] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4e71805f-0776-410e-91da-0a66209f843c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 844.883275] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Waiting for the task: (returnval){ [ 844.883275] env[65680]: value = "task-2847905" [ 844.883275] env[65680]: _type = "Task" [ 844.883275] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 844.892789] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Task: {'id': task-2847905, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 845.402377] env[65680]: DEBUG oslo_vmware.exceptions [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 845.402377] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 845.402832] env[65680]: ERROR nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 845.402832] env[65680]: Faults: ['InvalidArgument'] [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Traceback (most recent call last): [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] yield resources [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self.driver.spawn(context, instance, image_meta, [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self._fetch_image_if_missing(context, vi) [ 845.402832] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] image_cache(vi, tmp_image_ds_loc) [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] vm_util.copy_virtual_disk( [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 
059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] session._wait_for_task(vmdk_copy_task) [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] return self.wait_for_task(task_ref) [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] return evt.wait() [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] result = hub.switch() [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 845.403361] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] return self.greenlet.switch() [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self.f(*self.args, **self.kw) [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] raise exceptions.translate_fault(task_info.error) [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Faults: ['InvalidArgument'] [ 845.403758] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] [ 845.403758] env[65680]: INFO nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Terminating instance [ 845.406363] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 845.406363] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 
tempest-ServersAdmin275Test-1110301119-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 845.407152] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 845.407284] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 845.407733] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1c28544-0f40-4200-a19f-c7b4002638da {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.411260] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e7d88ab-acc5-4256-89a9-7f5868b6a6dd {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.419881] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 845.421747] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-efe5d46d-b25d-44bd-adbf-bdb4633e7baf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.423371] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 845.423991] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 845.424703] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-80a1faea-57e3-46f5-892a-575a7d39b626 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.431172] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for the task: (returnval){ [ 845.431172] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52e8d8f2-d790-8751-b8c7-e5150d235c77" [ 845.431172] env[65680]: _type = "Task" [ 845.431172] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 845.441123] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52e8d8f2-d790-8751-b8c7-e5150d235c77, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 845.497396] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 845.497396] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 845.497396] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Deleting the datastore file [datastore1] 059f5688-3497-40bd-bf18-9c0748f3bdd6 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 845.497396] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-14a0cb92-4555-4a65-b574-f7396374fbf8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.503803] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Waiting for the task: (returnval){ [ 845.503803] env[65680]: value = "task-2847907" [ 845.503803] env[65680]: _type = "Task" [ 845.503803] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 845.514632] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Task: {'id': task-2847907, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 845.945364] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 845.945742] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Creating directory with path [datastore1] vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 845.945742] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2a616ea4-05d3-482d-8802-70a18c273021 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.960020] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Created directory with path [datastore1] vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 845.960020] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Fetch image to [datastore1] vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 845.960020] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 845.960768] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7cbffe3-cf41-4fa6-b503-d5495e28a147 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.969134] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27aaad50-cd51-408d-9de8-bdd8e587cc50 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.979742] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d249990-966a-45f1-88c8-c1520175faaa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.023498] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60e4f09b-b55f-4a20-814f-8f4757587968 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.031925] env[65680]: DEBUG oslo_vmware.api [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Task: {'id': task-2847907, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078437} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 846.033675] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 846.033867] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 846.034175] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 846.034271] env[65680]: INFO nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Took 0.63 seconds to destroy the instance on the hypervisor. 
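
The tracebacks above (wait_for_task -> loopingcall -> _poll_task -> translate_fault) and the "Task: {'id': task-..., progress is 0%" records all follow the same pattern: a vCenter task reference is polled until it reports success or error, and an error is translated into an exception such as VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']). The following is a minimal self-contained sketch of that polling pattern, not the actual oslo.vmware implementation; the VimFaultException class and the task-info dictionaries here are simplified stand-ins.

```python
# Simplified sketch of the task-polling loop seen in the tracebacks above.
# NOT the oslo.vmware code: VimFaultException and the task-info dicts are
# illustrative stand-ins for the real SOAP task objects.
import time


class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(poll_task_info, interval=0.5):
    """Poll a task until it finishes, mirroring the _poll_task loop."""
    while True:
        info = poll_task_info()              # e.g. CopyVirtualDisk_Task state
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # In the log, the task error is translated into an exception here.
            raise VimFaultException(info["faults"], info["message"])
        time.sleep(interval)                 # task still queued / running


if __name__ == "__main__":
    states = iter([
        {"state": "running"},
        {"state": "error", "faults": ["InvalidArgument"],
         "message": "A specified parameter was not correct: fileType"},
    ])
    try:
        wait_for_task(lambda: next(states), interval=0.01)
    except VimFaultException as exc:
        print("Task failed:", exc, exc.fault_list)
```
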
[ 846.036118] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-71034028-fa8a-4db2-9053-339e5d97221c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.038493] env[65680]: DEBUG nova.compute.claims [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 846.038493] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 846.038744] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 846.062204] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 846.128176] env[65680]: DEBUG oslo_vmware.rw_handles [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 846.193665] env[65680]: DEBUG oslo_vmware.rw_handles [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 846.193743] env[65680]: DEBUG oslo_vmware.rw_handles [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 846.260303] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d72efe84-54ea-4b61-b8d0-29dd84a10392 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.268949] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac5e796b-cb13-4a5f-ba2f-316b1d52d235 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.299502] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2565fcb9-a844-4ca7-ae18-726db9aafea7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.307032] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c360556d-aac1-47a2-9d2e-afa6f16c3557 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.320581] env[65680]: DEBUG nova.compute.provider_tree [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 846.331672] env[65680]: DEBUG nova.scheduler.client.report [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 846.353878] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 846.353878] env[65680]: ERROR nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 846.353878] env[65680]: Faults: ['InvalidArgument'] [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Traceback (most recent call last): [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, 
in _build_and_run_instance [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self.driver.spawn(context, instance, image_meta, [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 846.353878] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self._fetch_image_if_missing(context, vi) [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] image_cache(vi, tmp_image_ds_loc) [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] vm_util.copy_virtual_disk( [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] session._wait_for_task(vmdk_copy_task) [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] return self.wait_for_task(task_ref) [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] return evt.wait() [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] result = hub.switch() [ 846.354593] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] return self.greenlet.switch() [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] self.f(*self.args, **self.kw) [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] raise exceptions.translate_fault(task_info.error) [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Faults: ['InvalidArgument'] [ 846.355238] env[65680]: ERROR nova.compute.manager [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] [ 846.355238] env[65680]: DEBUG nova.compute.utils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 846.356228] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Build of instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 was re-scheduled: A specified parameter was not correct: fileType [ 846.356228] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 846.356607] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 846.356727] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 846.356899] env[65680]: DEBUG nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 846.357154] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 847.772047] env[65680]: DEBUG nova.network.neutron [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 847.792252] env[65680]: INFO nova.compute.manager [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Took 1.43 seconds to deallocate network for instance. [ 847.908203] env[65680]: INFO nova.scheduler.client.report [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Deleted allocations for instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 [ 847.934716] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c47009df-2b3c-465c-96d1-81ae80881837 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 291.024s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 847.940453] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 289.460s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.940453] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] During sync_power_state the instance has a pending task (spawning). Skip. 
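
The surrounding records show the lock accounting that the oslo.concurrency wrappers emit around critical sections such as "compute_resources" and the per-instance UUID locks: "acquired ... waited N.NNNs" on entry and "released ... held N.NNNs" on exit. The sketch below is only a minimal illustration of that timing pattern using the standard library, not the oslo.concurrency lockutils implementation; the lock names and timings are examples.

```python
# Minimal illustration (not oslo.concurrency) of the
# 'acquired :: waited Ns' / 'released :: held Ns' accounting
# seen around sections such as "compute_resources" in the log.
import threading
import time
from contextlib import contextmanager

_locks = {}


@contextmanager
def timed_lock(name):
    lock = _locks.setdefault(name, threading.Lock())
    start = time.monotonic()
    lock.acquire()                       # may block while another worker holds it
    waited = time.monotonic() - start
    print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
    held_start = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - held_start
        print(f'Lock "{name}" released :: held {held:.3f}s')


if __name__ == "__main__":
    with timed_lock("compute_resources"):
        time.sleep(0.05)                 # critical section, e.g. a resource claim
```
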
[ 847.940453] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 847.940453] env[65680]: DEBUG oslo_concurrency.lockutils [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 90.858s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.940632] env[65680]: DEBUG oslo_concurrency.lockutils [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Acquiring lock "059f5688-3497-40bd-bf18-9c0748f3bdd6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 847.940632] env[65680]: DEBUG oslo_concurrency.lockutils [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.940632] env[65680]: DEBUG oslo_concurrency.lockutils [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 847.942795] env[65680]: INFO nova.compute.manager [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Terminating instance [ 847.944937] env[65680]: DEBUG nova.compute.manager [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 847.945492] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 847.945846] env[65680]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d2ef2492-098c-49e6-9b58-e52e493adba9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.958686] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84aa6323-67d7-4a58-af7a-977c70895680 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.971037] env[65680]: DEBUG nova.compute.manager [None req-239cfc49-995f-461e-990f-bf1c33008cd1 tempest-ServersTestJSON-2079208081 tempest-ServersTestJSON-2079208081-project-member] [instance: 3c728886-a983-4071-a728-25d87770556f] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 847.991819] env[65680]: WARNING nova.virt.vmwareapi.vmops [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 059f5688-3497-40bd-bf18-9c0748f3bdd6 could not be found. [ 847.992014] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 847.992268] env[65680]: INFO nova.compute.manager [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Took 0.05 seconds to destroy the instance on the hypervisor. [ 847.993750] env[65680]: DEBUG oslo.service.loopingcall [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 847.993750] env[65680]: DEBUG nova.compute.manager [-] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 847.993750] env[65680]: DEBUG nova.network.neutron [-] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 848.014859] env[65680]: DEBUG nova.compute.manager [None req-239cfc49-995f-461e-990f-bf1c33008cd1 tempest-ServersTestJSON-2079208081 tempest-ServersTestJSON-2079208081-project-member] [instance: 3c728886-a983-4071-a728-25d87770556f] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.045790] env[65680]: DEBUG oslo_concurrency.lockutils [None req-239cfc49-995f-461e-990f-bf1c33008cd1 tempest-ServersTestJSON-2079208081 tempest-ServersTestJSON-2079208081-project-member] Lock "3c728886-a983-4071-a728-25d87770556f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.150s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.075055] env[65680]: DEBUG nova.compute.manager [None req-46c28720-73a0-4f0c-8d38-e23ea117d331 tempest-ImagesTestJSON-24601272 tempest-ImagesTestJSON-24601272-project-member] [instance: ba875739-2ff0-4778-89cf-5b32f2ffe6fd] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.123053] env[65680]: DEBUG nova.compute.manager [None req-46c28720-73a0-4f0c-8d38-e23ea117d331 tempest-ImagesTestJSON-24601272 tempest-ImagesTestJSON-24601272-project-member] [instance: ba875739-2ff0-4778-89cf-5b32f2ffe6fd] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.163280] env[65680]: DEBUG oslo_concurrency.lockutils [None req-46c28720-73a0-4f0c-8d38-e23ea117d331 tempest-ImagesTestJSON-24601272 tempest-ImagesTestJSON-24601272-project-member] Lock "ba875739-2ff0-4778-89cf-5b32f2ffe6fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.093s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.178855] env[65680]: DEBUG nova.compute.manager [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] [instance: d1f6ea52-4367-4756-88ff-37830ce1aeba] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.213428] env[65680]: DEBUG nova.compute.manager [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] [instance: d1f6ea52-4367-4756-88ff-37830ce1aeba] Instance disappeared before build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.244740] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Lock "d1f6ea52-4367-4756-88ff-37830ce1aeba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.621s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.258787] env[65680]: DEBUG nova.compute.manager [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] [instance: 11374639-ed45-4999-b8b9-fdbf08b9d8bd] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.266142] env[65680]: DEBUG nova.network.neutron [-] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 848.281223] env[65680]: INFO nova.compute.manager [-] [instance: 059f5688-3497-40bd-bf18-9c0748f3bdd6] Took 0.29 seconds to deallocate network for instance. [ 848.288858] env[65680]: DEBUG nova.compute.manager [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] [instance: 11374639-ed45-4999-b8b9-fdbf08b9d8bd] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.317019] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Lock "11374639-ed45-4999-b8b9-fdbf08b9d8bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.664s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.349911] env[65680]: DEBUG nova.compute.manager [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] [instance: a3876ce4-3e1d-4450-896c-b8321cc1a312] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.390436] env[65680]: DEBUG nova.compute.manager [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] [instance: a3876ce4-3e1d-4450-896c-b8321cc1a312] Instance disappeared before build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.424650] env[65680]: DEBUG oslo_concurrency.lockutils [None req-248a8c74-caef-4a03-b1d6-3a4e541c0911 tempest-ListServersNegativeTestJSON-1391608803 tempest-ListServersNegativeTestJSON-1391608803-project-member] Lock "a3876ce4-3e1d-4450-896c-b8321cc1a312" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.733s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.447015] env[65680]: DEBUG oslo_concurrency.lockutils [None req-30fec49e-cc4a-4e2c-a448-a8aa1958bc2f tempest-FloatingIPsAssociationNegativeTestJSON-1097254432 tempest-FloatingIPsAssociationNegativeTestJSON-1097254432-project-member] Lock "059f5688-3497-40bd-bf18-9c0748f3bdd6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.508s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.449360] env[65680]: DEBUG nova.compute.manager [None req-6303a8ec-1583-4bc0-9dd9-67ad950255d0 tempest-ServersTestJSON-1264796280 tempest-ServersTestJSON-1264796280-project-member] [instance: b663e64f-77a5-4938-a492-df6f05bb182e] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.480648] env[65680]: DEBUG nova.compute.manager [None req-6303a8ec-1583-4bc0-9dd9-67ad950255d0 tempest-ServersTestJSON-1264796280 tempest-ServersTestJSON-1264796280-project-member] [instance: b663e64f-77a5-4938-a492-df6f05bb182e] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.516518] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6303a8ec-1583-4bc0-9dd9-67ad950255d0 tempest-ServersTestJSON-1264796280 tempest-ServersTestJSON-1264796280-project-member] Lock "b663e64f-77a5-4938-a492-df6f05bb182e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.392s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.550130] env[65680]: DEBUG nova.compute.manager [None req-e8bed0cd-990f-4042-aa34-f54cfec813f7 tempest-AttachInterfacesTestJSON-1874593521 tempest-AttachInterfacesTestJSON-1874593521-project-member] [instance: c9022be0-026f-4f6a-b720-162cadcd76bb] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.585100] env[65680]: DEBUG nova.compute.manager [None req-e8bed0cd-990f-4042-aa34-f54cfec813f7 tempest-AttachInterfacesTestJSON-1874593521 tempest-AttachInterfacesTestJSON-1874593521-project-member] [instance: c9022be0-026f-4f6a-b720-162cadcd76bb] Instance disappeared before build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.608865] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e8bed0cd-990f-4042-aa34-f54cfec813f7 tempest-AttachInterfacesTestJSON-1874593521 tempest-AttachInterfacesTestJSON-1874593521-project-member] Lock "c9022be0-026f-4f6a-b720-162cadcd76bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.995s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.637470] env[65680]: DEBUG nova.compute.manager [None req-22330e79-fe29-411a-8c0c-30be455b8072 tempest-ServerGroupTestJSON-1851943177 tempest-ServerGroupTestJSON-1851943177-project-member] [instance: ce17e81b-0291-4309-8594-28ea20c530a9] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.666890] env[65680]: DEBUG nova.compute.manager [None req-22330e79-fe29-411a-8c0c-30be455b8072 tempest-ServerGroupTestJSON-1851943177 tempest-ServerGroupTestJSON-1851943177-project-member] [instance: ce17e81b-0291-4309-8594-28ea20c530a9] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.691013] env[65680]: DEBUG oslo_concurrency.lockutils [None req-22330e79-fe29-411a-8c0c-30be455b8072 tempest-ServerGroupTestJSON-1851943177 tempest-ServerGroupTestJSON-1851943177-project-member] Lock "ce17e81b-0291-4309-8594-28ea20c530a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.840s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.702721] env[65680]: DEBUG nova.compute.manager [None req-eba879e1-ec5d-45a6-a716-21597e3f78a5 tempest-ServersTestMultiNic-905135208 tempest-ServersTestMultiNic-905135208-project-member] [instance: 53010485-3888-4669-85b7-01381f0bffcd] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.729381] env[65680]: DEBUG nova.compute.manager [None req-eba879e1-ec5d-45a6-a716-21597e3f78a5 tempest-ServersTestMultiNic-905135208 tempest-ServersTestMultiNic-905135208-project-member] [instance: 53010485-3888-4669-85b7-01381f0bffcd] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.757414] env[65680]: DEBUG oslo_concurrency.lockutils [None req-eba879e1-ec5d-45a6-a716-21597e3f78a5 tempest-ServersTestMultiNic-905135208 tempest-ServersTestMultiNic-905135208-project-member] Lock "53010485-3888-4669-85b7-01381f0bffcd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.505s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.771548] env[65680]: DEBUG nova.compute.manager [None req-38a99a56-39aa-4a84-a86d-e5cbeac7b8cf tempest-ServerRescueTestJSONUnderV235-332028835 tempest-ServerRescueTestJSONUnderV235-332028835-project-member] [instance: 017fddb0-49d5-434d-997d-126119a989ef] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.812280] env[65680]: DEBUG nova.compute.manager [None req-38a99a56-39aa-4a84-a86d-e5cbeac7b8cf tempest-ServerRescueTestJSONUnderV235-332028835 tempest-ServerRescueTestJSONUnderV235-332028835-project-member] [instance: 017fddb0-49d5-434d-997d-126119a989ef] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.835832] env[65680]: DEBUG oslo_concurrency.lockutils [None req-38a99a56-39aa-4a84-a86d-e5cbeac7b8cf tempest-ServerRescueTestJSONUnderV235-332028835 tempest-ServerRescueTestJSONUnderV235-332028835-project-member] Lock "017fddb0-49d5-434d-997d-126119a989ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.401s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.846600] env[65680]: DEBUG nova.compute.manager [None req-f37223fd-2de0-44a5-8ec5-d5bba1973458 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: d7b468db-115d-4b24-b604-5edb176dbf96] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.877121] env[65680]: DEBUG nova.compute.manager [None req-f37223fd-2de0-44a5-8ec5-d5bba1973458 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] [instance: d7b468db-115d-4b24-b604-5edb176dbf96] Instance disappeared before build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.905311] env[65680]: DEBUG oslo_concurrency.lockutils [None req-f37223fd-2de0-44a5-8ec5-d5bba1973458 tempest-DeleteServersAdminTestJSON-1739259118 tempest-DeleteServersAdminTestJSON-1739259118-project-member] Lock "d7b468db-115d-4b24-b604-5edb176dbf96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.097s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.928078] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.996314] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 848.996545] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.998192] env[65680]: INFO nova.compute.claims [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 849.122848] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73a1ca34-5888-4094-a93b-76b3988b7786 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.130899] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0262f57-d8f4-4ee1-8c6d-fa53942a3622 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.172267] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b70efcd4-be63-49d1-92be-e3fdb1112fa8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.182593] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf9b404f-8a87-48cc-8d8b-fb5b3373032f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.198604] env[65680]: DEBUG nova.compute.provider_tree [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 849.208092] env[65680]: DEBUG nova.scheduler.client.report [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 849.223802] env[65680]: DEBUG oslo_concurrency.lockutils [None 
req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 849.224347] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 849.263186] env[65680]: DEBUG nova.compute.utils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 849.264517] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 849.264730] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 849.278123] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 849.350593] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 849.375159] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 849.375159] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 849.375159] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 849.375378] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 849.375378] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 849.375378] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 849.375764] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 849.376240] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 849.376560] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 849.376842] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 849.377255] env[65680]: DEBUG nova.virt.hardware [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 849.378225] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca5ea607-c5b4-42a9-b62e-e8ecfc115f24 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.388287] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c74656-ebe7-44d2-98ee-60bd8b5fe742 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.501036] env[65680]: DEBUG nova.policy [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a6c6c625d134b5997fb46045a7e2ce4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3c0dc2bfba4645f4b23ce998fb7f33ad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 851.173115] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Successfully created port: 6759a487-7d1a-43df-bc0a-d98cd9572dc1 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 853.953047] env[65680]: DEBUG nova.compute.manager [req-5be32b3c-5b29-4b78-86a7-a36873bb1ca3 req-35deded9-9334-49d9-8963-83aa3233bf94 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Received event network-vif-plugged-6759a487-7d1a-43df-bc0a-d98cd9572dc1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 853.953328] env[65680]: DEBUG oslo_concurrency.lockutils [req-5be32b3c-5b29-4b78-86a7-a36873bb1ca3 req-35deded9-9334-49d9-8963-83aa3233bf94 service nova] Acquiring lock "e5d6d263-463e-46b8-9bb3-d10a4101d4e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 853.953428] env[65680]: 
DEBUG oslo_concurrency.lockutils [req-5be32b3c-5b29-4b78-86a7-a36873bb1ca3 req-35deded9-9334-49d9-8963-83aa3233bf94 service nova] Lock "e5d6d263-463e-46b8-9bb3-d10a4101d4e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 853.953580] env[65680]: DEBUG oslo_concurrency.lockutils [req-5be32b3c-5b29-4b78-86a7-a36873bb1ca3 req-35deded9-9334-49d9-8963-83aa3233bf94 service nova] Lock "e5d6d263-463e-46b8-9bb3-d10a4101d4e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 853.953740] env[65680]: DEBUG nova.compute.manager [req-5be32b3c-5b29-4b78-86a7-a36873bb1ca3 req-35deded9-9334-49d9-8963-83aa3233bf94 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] No waiting events found dispatching network-vif-plugged-6759a487-7d1a-43df-bc0a-d98cd9572dc1 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 853.953929] env[65680]: WARNING nova.compute.manager [req-5be32b3c-5b29-4b78-86a7-a36873bb1ca3 req-35deded9-9334-49d9-8963-83aa3233bf94 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Received unexpected event network-vif-plugged-6759a487-7d1a-43df-bc0a-d98cd9572dc1 for instance with vm_state building and task_state spawning. [ 854.500731] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Successfully updated port: 6759a487-7d1a-43df-bc0a-d98cd9572dc1 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 854.516222] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquiring lock "refresh_cache-e5d6d263-463e-46b8-9bb3-d10a4101d4e0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 854.516350] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquired lock "refresh_cache-e5d6d263-463e-46b8-9bb3-d10a4101d4e0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 854.516459] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 854.705592] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 855.974918] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Updating instance_info_cache with network_info: [{"id": "6759a487-7d1a-43df-bc0a-d98cd9572dc1", "address": "fa:16:3e:80:f1:2d", "network": {"id": "fe302b8b-eac9-4bc6-9b92-3f99951776cd", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1004207439-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c0dc2bfba4645f4b23ce998fb7f33ad", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5fdd0624-2edb-4733-8284-225815c07f73", "external-id": "nsx-vlan-transportzone-330", "segmentation_id": 330, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6759a487-7d", "ovs_interfaceid": "6759a487-7d1a-43df-bc0a-d98cd9572dc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.991836] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Releasing lock "refresh_cache-e5d6d263-463e-46b8-9bb3-d10a4101d4e0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 855.992193] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Instance network_info: |[{"id": "6759a487-7d1a-43df-bc0a-d98cd9572dc1", "address": "fa:16:3e:80:f1:2d", "network": {"id": "fe302b8b-eac9-4bc6-9b92-3f99951776cd", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1004207439-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c0dc2bfba4645f4b23ce998fb7f33ad", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5fdd0624-2edb-4733-8284-225815c07f73", "external-id": "nsx-vlan-transportzone-330", "segmentation_id": 330, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6759a487-7d", "ovs_interfaceid": "6759a487-7d1a-43df-bc0a-d98cd9572dc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 855.993398] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:80:f1:2d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5fdd0624-2edb-4733-8284-225815c07f73', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6759a487-7d1a-43df-bc0a-d98cd9572dc1', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 856.005793] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Creating folder: Project (3c0dc2bfba4645f4b23ce998fb7f33ad). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 856.006486] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8c4545aa-ebde-46af-b5e7-e522f4ad9bf2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.017168] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Created folder: Project (3c0dc2bfba4645f4b23ce998fb7f33ad) in parent group-v572532. [ 856.017463] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Creating folder: Instances. Parent ref: group-v572586. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 856.017709] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-84c54378-94a2-401e-97f7-860d09b6631a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.028037] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Created folder: Instances in parent group-v572586. [ 856.028210] env[65680]: DEBUG oslo.service.loopingcall [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 856.028640] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 856.028640] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fd76bb15-1e27-4f51-8bf9-1047b4535baa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.049851] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 856.049851] env[65680]: value = "task-2847910" [ 856.049851] env[65680]: _type = "Task" [ 856.049851] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 856.056910] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847910, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 856.559664] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847910, 'name': CreateVM_Task, 'duration_secs': 0.363641} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 856.559850] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 856.560749] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 856.560915] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 856.561262] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 856.561514] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8b538b50-fedd-48a2-afe2-03bfbf1c73e2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.569313] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Waiting for the task: (returnval){ [ 856.569313] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]520829f3-68de-8c57-db37-1cec4d3f1796" [ 856.569313] env[65680]: _type = "Task" [ 856.569313] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 856.577638] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]520829f3-68de-8c57-db37-1cec4d3f1796, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 856.594815] env[65680]: DEBUG nova.compute.manager [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Received event network-changed-6759a487-7d1a-43df-bc0a-d98cd9572dc1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 856.595012] env[65680]: DEBUG nova.compute.manager [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Refreshing instance network info cache due to event network-changed-6759a487-7d1a-43df-bc0a-d98cd9572dc1. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 856.595233] env[65680]: DEBUG oslo_concurrency.lockutils [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] Acquiring lock "refresh_cache-e5d6d263-463e-46b8-9bb3-d10a4101d4e0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 856.595376] env[65680]: DEBUG oslo_concurrency.lockutils [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] Acquired lock "refresh_cache-e5d6d263-463e-46b8-9bb3-d10a4101d4e0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 856.595533] env[65680]: DEBUG nova.network.neutron [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Refreshing network info cache for port 6759a487-7d1a-43df-bc0a-d98cd9572dc1 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 857.084247] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 857.084906] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 857.085444] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 858.410731] 
env[65680]: DEBUG nova.network.neutron [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Updated VIF entry in instance network info cache for port 6759a487-7d1a-43df-bc0a-d98cd9572dc1. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 858.410731] env[65680]: DEBUG nova.network.neutron [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Updating instance_info_cache with network_info: [{"id": "6759a487-7d1a-43df-bc0a-d98cd9572dc1", "address": "fa:16:3e:80:f1:2d", "network": {"id": "fe302b8b-eac9-4bc6-9b92-3f99951776cd", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1004207439-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c0dc2bfba4645f4b23ce998fb7f33ad", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5fdd0624-2edb-4733-8284-225815c07f73", "external-id": "nsx-vlan-transportzone-330", "segmentation_id": 330, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6759a487-7d", "ovs_interfaceid": "6759a487-7d1a-43df-bc0a-d98cd9572dc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 858.419417] env[65680]: DEBUG oslo_concurrency.lockutils [req-c8635d5c-bb47-455a-b6cd-0baa834665d5 req-c26568e4-7ec4-4b98-ab1c-190791a73c80 service nova] Releasing lock "refresh_cache-e5d6d263-463e-46b8-9bb3-d10a4101d4e0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 864.293596] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 864.293942] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 864.293942] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 864.307903] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.308080] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.308385] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.308385] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 864.308770] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 865.303611] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 866.293015] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 867.293433] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.293666] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.293666] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.293666] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 868.294280] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.309435] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.309670] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.309839] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.309999] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 868.314018] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a2360e8-29c3-4ee7-a9a6-fa862ed58991 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.320757] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f550f7a1-cc42-4adc-9d41-baaacfc41188 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.335538] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee8f2d64-03a3-4221-b16e-6788d47abaef {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.342225] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecb5f507-fd50-4657-9268-51d4bad0f96d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.372976] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181075MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 868.373252] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.373490] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.434991] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance fc14c935-fe84-4a49-ac1e-575e56b672a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 868.435202] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance acbe2170-7ce3-4820-b082-6680e559bde1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 868.435721] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance e5d6d263-463e-46b8-9bb3-d10a4101d4e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 868.435939] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 868.436102] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 868.519585] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19cf6ae9-accd-4b03-8805-3bcce6773439 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.528991] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adaa2464-7dfd-4e37-a0a3-264706579220 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.566486] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e33206b-739b-41de-815f-1d45bdc6fd71 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.576418] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6e296f7-c176-4db4-bee5-55f2ceee1cc6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.591292] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 868.607995] env[65680]: DEBUG nova.scheduler.client.report [None 
req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 868.624362] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 868.624643] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 870.620125] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.293804] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.051689] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "abb69e61-9594-48b5-b3f4-f8ba39f93f0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.051910] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "abb69e61-9594-48b5-b3f4-f8ba39f93f0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 874.380137] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquiring lock "05ef6eca-eb64-43b3-8c7d-b5a230282a8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 874.380390] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Lock "05ef6eca-eb64-43b3-8c7d-b5a230282a8f" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 874.896134] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "8b747838-fcd0-494c-bd5a-0e5b1950a44e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 874.896134] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "8b747838-fcd0-494c-bd5a-0e5b1950a44e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 874.921709] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "01e82211-1de5-44ad-b14e-81a54470d4e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 874.922186] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "01e82211-1de5-44ad-b14e-81a54470d4e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 875.534437] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquiring lock "dd382edd-abe8-4764-a9d5-4144ef7d50b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 875.534691] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Lock "dd382edd-abe8-4764-a9d5-4144ef7d50b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.284328] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquiring lock "c9230f1c-72ea-4f62-be9f-949def49c5f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.284328] env[65680]: DEBUG oslo_concurrency.lockutils [None 
req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Lock "c9230f1c-72ea-4f62-be9f-949def49c5f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 880.033990] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquiring lock "132e6039-55dc-4118-bcd5-d32557743981" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 880.034366] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Lock "132e6039-55dc-4118-bcd5-d32557743981" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 891.703014] env[65680]: WARNING oslo_vmware.rw_handles [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 891.703014] env[65680]: ERROR oslo_vmware.rw_handles [ 891.703760] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 891.705241] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: 
fc14c935-fe84-4a49-ac1e-575e56b672a3] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 891.705491] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Copying Virtual Disk [datastore1] vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/8e7db4c2-608d-40da-8973-ab83a01365fb/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 891.705771] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f836697a-3e75-4ce0-a37d-8e9e2d311845 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 891.713323] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for the task: (returnval){ [ 891.713323] env[65680]: value = "task-2847911" [ 891.713323] env[65680]: _type = "Task" [ 891.713323] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 891.720840] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Task: {'id': task-2847911, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 892.225060] env[65680]: DEBUG oslo_vmware.exceptions [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 892.225060] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 892.225060] env[65680]: ERROR nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 892.225060] env[65680]: Faults: ['InvalidArgument'] [ 892.225060] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Traceback (most recent call last): [ 892.225060] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 892.225060] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] yield resources [ 892.225060] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 892.225060] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self.driver.spawn(context, instance, image_meta, [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self._fetch_image_if_missing(context, vi) [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] image_cache(vi, tmp_image_ds_loc) [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] vm_util.copy_virtual_disk( [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] session._wait_for_task(vmdk_copy_task) [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 892.225417] 
env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] return self.wait_for_task(task_ref) [ 892.225417] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] return evt.wait() [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] result = hub.switch() [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] return self.greenlet.switch() [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self.f(*self.args, **self.kw) [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] raise exceptions.translate_fault(task_info.error) [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Faults: ['InvalidArgument'] [ 892.225789] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] [ 892.225789] env[65680]: INFO nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Terminating instance [ 892.226981] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 892.227202] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 892.227429] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ee505af1-e9a8-47e4-90c6-ae7ad2a9829c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.229750] env[65680]: DEBUG oslo_concurrency.lockutils 
[None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 892.229819] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquired lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 892.229950] env[65680]: DEBUG nova.network.neutron [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 892.236972] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 892.237165] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 892.238326] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-19873450-5ae1-406a-a03c-3d8a903fc86d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.245557] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Waiting for the task: (returnval){ [ 892.245557] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5238f9f5-e13f-eb8f-4562-8ac1148d86ed" [ 892.245557] env[65680]: _type = "Task" [ 892.245557] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 892.252968] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5238f9f5-e13f-eb8f-4562-8ac1148d86ed, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 892.257050] env[65680]: DEBUG nova.network.neutron [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 892.393992] env[65680]: DEBUG nova.network.neutron [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 892.405879] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Releasing lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 892.406312] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 892.406503] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 892.407525] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0313afea-74b0-482f-bfce-6b03e772fba9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.415314] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 892.415536] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7edd4d56-08f0-4e54-8922-03aee534dfa7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.442483] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 892.442739] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 892.442924] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Deleting the datastore file [datastore1] fc14c935-fe84-4a49-ac1e-575e56b672a3 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 892.443182] env[65680]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-21318c9b-88ae-4d2c-8ace-df67fe0115bc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.450355] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for the task: (returnval){ [ 892.450355] env[65680]: value = "task-2847913" [ 892.450355] env[65680]: _type = "Task" [ 892.450355] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 892.457633] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Task: {'id': task-2847913, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 892.755589] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 892.755905] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Creating directory with path [datastore1] vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 892.756054] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33017e27-26cd-438a-ae61-ed41de961042 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.766487] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Created directory with path [datastore1] vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 892.766673] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Fetch image to [datastore1] vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 892.766826] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 
892.767516] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b8f0c9-6656-4e1d-a21b-0a6e8b8053b1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.775054] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-797736d0-3ecf-40cf-9def-5187d9240269 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.783533] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b76b00b9-2cb9-4c48-bf2b-6067aafed17b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.813143] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac668cd0-fa97-45e3-8400-32a3df805170 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.818337] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5bd2a608-3134-4fae-804e-a98267b1afa4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.837511] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 892.880933] env[65680]: DEBUG oslo_vmware.rw_handles [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 892.938454] env[65680]: DEBUG oslo_vmware.rw_handles [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 892.938679] env[65680]: DEBUG oslo_vmware.rw_handles [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 892.959980] env[65680]: DEBUG oslo_vmware.api [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Task: {'id': task-2847913, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042758} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 892.960289] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 892.960551] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 892.960706] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 892.960902] env[65680]: INFO nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Took 0.55 seconds to destroy the instance on the hypervisor. [ 892.961177] env[65680]: DEBUG oslo.service.loopingcall [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 892.961393] env[65680]: DEBUG nova.compute.manager [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network deallocation for instance since networking was not requested. 
{{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 892.963610] env[65680]: DEBUG nova.compute.claims [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 892.963767] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 892.963976] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 893.111654] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c9e7abf-64f8-4a20-8b24-3d8e847b899e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.118753] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c96f09d-add1-4afe-8f9a-2ba7440c96e1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.148394] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-496fa099-ef73-414b-8ce3-17120a6e8b88 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.155014] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-829e9a30-4d6c-499f-b956-d99f6442fc7f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.167615] env[65680]: DEBUG nova.compute.provider_tree [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 893.176165] env[65680]: DEBUG nova.scheduler.client.report [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 893.191688] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 
tempest-ServersAdmin275Test-1110301119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.228s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 893.192199] env[65680]: ERROR nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 893.192199] env[65680]: Faults: ['InvalidArgument'] [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Traceback (most recent call last): [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self.driver.spawn(context, instance, image_meta, [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self._fetch_image_if_missing(context, vi) [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] image_cache(vi, tmp_image_ds_loc) [ 893.192199] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] vm_util.copy_virtual_disk( [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] session._wait_for_task(vmdk_copy_task) [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] return self.wait_for_task(task_ref) [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] return evt.wait() [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] result = hub.switch() [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] return self.greenlet.switch() [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 893.192561] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] self.f(*self.args, **self.kw) [ 893.192894] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 893.192894] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] raise exceptions.translate_fault(task_info.error) [ 893.192894] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 893.192894] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Faults: ['InvalidArgument'] [ 893.192894] env[65680]: ERROR nova.compute.manager [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] [ 893.193040] env[65680]: DEBUG nova.compute.utils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 893.194337] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Build of instance fc14c935-fe84-4a49-ac1e-575e56b672a3 was re-scheduled: A specified parameter was not correct: fileType [ 893.194337] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 893.194722] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 893.194960] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 893.195125] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquired lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 893.195286] env[65680]: DEBUG nova.network.neutron [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 893.218124] env[65680]: DEBUG nova.network.neutron [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 893.275963] env[65680]: DEBUG nova.network.neutron [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 893.284109] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Releasing lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 893.284324] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 893.284505] env[65680]: DEBUG nova.compute.manager [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Skipping network deallocation for instance since networking was not requested. 
{{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 893.364610] env[65680]: INFO nova.scheduler.client.report [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Deleted allocations for instance fc14c935-fe84-4a49-ac1e-575e56b672a3 [ 893.378364] env[65680]: DEBUG oslo_concurrency.lockutils [None req-5f26dcf3-c581-462b-bf99-104fb9785f62 tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "fc14c935-fe84-4a49-ac1e-575e56b672a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 329.650s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 893.379322] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "fc14c935-fe84-4a49-ac1e-575e56b672a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 128.825s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 893.379541] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "fc14c935-fe84-4a49-ac1e-575e56b672a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 893.379756] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "fc14c935-fe84-4a49-ac1e-575e56b672a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 893.379948] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "fc14c935-fe84-4a49-ac1e-575e56b672a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 893.381741] env[65680]: INFO nova.compute.manager [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Terminating instance [ 893.383264] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquiring lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 893.383418] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Acquired lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 893.383581] env[65680]: DEBUG nova.network.neutron [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 893.395729] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 893.405601] env[65680]: DEBUG nova.network.neutron [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 893.448123] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 893.448436] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 893.449899] env[65680]: INFO nova.compute.claims [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 893.484515] env[65680]: DEBUG nova.network.neutron [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 893.494204] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Releasing lock "refresh_cache-fc14c935-fe84-4a49-ac1e-575e56b672a3" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 893.494204] env[65680]: DEBUG nova.compute.manager [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 893.494344] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 893.494762] env[65680]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b481cf7e-e5c1-4e77-95a1-707cbe24d5f9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.503397] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-681b3fdb-e886-4562-8c51-1bfc084c9c13 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.533876] env[65680]: WARNING nova.virt.vmwareapi.vmops [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fc14c935-fe84-4a49-ac1e-575e56b672a3 could not be found. [ 893.534107] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 893.534276] env[65680]: INFO nova.compute.manager [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 893.534507] env[65680]: DEBUG oslo.service.loopingcall [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 893.537019] env[65680]: DEBUG nova.compute.manager [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 893.537138] env[65680]: DEBUG nova.network.neutron [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 893.554781] env[65680]: DEBUG nova.network.neutron [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 893.562116] env[65680]: DEBUG nova.network.neutron [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 893.572832] env[65680]: INFO nova.compute.manager [-] [instance: fc14c935-fe84-4a49-ac1e-575e56b672a3] Took 0.04 seconds to deallocate network for instance. 
[ 893.623369] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c71f7bcd-9174-483d-83af-47df3bfe25ad {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.630566] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b78c6e-df43-47f9-9825-bfe70d80c916 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.662289] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a0f94d1-dbdb-4001-81af-08efcc46585a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.664998] env[65680]: DEBUG oslo_concurrency.lockutils [None req-9b922bd1-9c81-43be-a749-982d865f3b8f tempest-ServersAdmin275Test-1110301119 tempest-ServersAdmin275Test-1110301119-project-member] Lock "fc14c935-fe84-4a49-ac1e-575e56b672a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.286s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 893.672031] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-263b0a9f-2c59-4bd2-aacf-f7af7cad6b98 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.685289] env[65680]: DEBUG nova.compute.provider_tree [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 893.692780] env[65680]: DEBUG nova.scheduler.client.report [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 893.704333] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 893.704782] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Start building networks asynchronously for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 893.734705] env[65680]: DEBUG nova.compute.utils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 893.735777] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 893.735945] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 893.744529] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 893.790151] env[65680]: DEBUG nova.policy [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0a4078f7644f57884a39d3369ceb7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ec0d6e13ecf4b72b79052a4077a754f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 893.806145] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 893.828815] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 893.829035] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 893.829188] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 893.829367] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 893.829505] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 893.829807] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 893.829943] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 893.831088] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 893.831088] env[65680]: DEBUG 
nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 893.831088] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 893.831088] env[65680]: DEBUG nova.virt.hardware [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 893.831481] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a780ac73-c7d7-4678-b8b5-fed3c4b9bfae {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.839489] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72ec9641-33d0-4733-8bc6-58a48e395160 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 894.089783] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Successfully created port: 93599a65-fa65-48cd-80fe-0d734baa05b1 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 894.718566] env[65680]: DEBUG nova.compute.manager [req-e2974957-7dad-42fb-bcbd-dc01552901fe req-140f1104-ef78-4f43-8e1f-653a0cee9d6f service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Received event network-vif-plugged-93599a65-fa65-48cd-80fe-0d734baa05b1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 894.718921] env[65680]: DEBUG oslo_concurrency.lockutils [req-e2974957-7dad-42fb-bcbd-dc01552901fe req-140f1104-ef78-4f43-8e1f-653a0cee9d6f service nova] Acquiring lock "abb69e61-9594-48b5-b3f4-f8ba39f93f0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 894.719213] env[65680]: DEBUG oslo_concurrency.lockutils [req-e2974957-7dad-42fb-bcbd-dc01552901fe req-140f1104-ef78-4f43-8e1f-653a0cee9d6f service nova] Lock "abb69e61-9594-48b5-b3f4-f8ba39f93f0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 894.720725] env[65680]: DEBUG oslo_concurrency.lockutils [req-e2974957-7dad-42fb-bcbd-dc01552901fe req-140f1104-ef78-4f43-8e1f-653a0cee9d6f service nova] Lock "abb69e61-9594-48b5-b3f4-f8ba39f93f0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 894.720725] env[65680]: DEBUG nova.compute.manager 
[req-e2974957-7dad-42fb-bcbd-dc01552901fe req-140f1104-ef78-4f43-8e1f-653a0cee9d6f service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] No waiting events found dispatching network-vif-plugged-93599a65-fa65-48cd-80fe-0d734baa05b1 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 894.720725] env[65680]: WARNING nova.compute.manager [req-e2974957-7dad-42fb-bcbd-dc01552901fe req-140f1104-ef78-4f43-8e1f-653a0cee9d6f service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Received unexpected event network-vif-plugged-93599a65-fa65-48cd-80fe-0d734baa05b1 for instance with vm_state building and task_state spawning. [ 894.757302] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Successfully updated port: 93599a65-fa65-48cd-80fe-0d734baa05b1 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 894.768215] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "refresh_cache-abb69e61-9594-48b5-b3f4-f8ba39f93f0e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 894.768215] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "refresh_cache-abb69e61-9594-48b5-b3f4-f8ba39f93f0e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 894.768215] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 894.812636] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 894.973488] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Updating instance_info_cache with network_info: [{"id": "93599a65-fa65-48cd-80fe-0d734baa05b1", "address": "fa:16:3e:dc:26:8c", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap93599a65-fa", "ovs_interfaceid": "93599a65-fa65-48cd-80fe-0d734baa05b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 894.985053] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "refresh_cache-abb69e61-9594-48b5-b3f4-f8ba39f93f0e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 894.985384] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Instance network_info: |[{"id": "93599a65-fa65-48cd-80fe-0d734baa05b1", "address": "fa:16:3e:dc:26:8c", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap93599a65-fa", "ovs_interfaceid": "93599a65-fa65-48cd-80fe-0d734baa05b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 894.985798] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dc:26:8c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a06a63d6-2aeb-4084-8022-f804cac3fa74', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '93599a65-fa65-48cd-80fe-0d734baa05b1', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 894.994027] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating folder: Project (4ec0d6e13ecf4b72b79052a4077a754f). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 894.994027] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fb60f72f-54d7-48e3-a290-6d9260645cc6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.004382] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created folder: Project (4ec0d6e13ecf4b72b79052a4077a754f) in parent group-v572532. [ 895.004560] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating folder: Instances. Parent ref: group-v572589. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 895.004781] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cbc6142b-5df5-40df-98f6-811cc13e6a68 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.015210] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created folder: Instances in parent group-v572589. [ 895.015430] env[65680]: DEBUG oslo.service.loopingcall [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 895.015613] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 895.015804] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b21c3a6-3787-41d3-9c9d-a7f00e6f4b3f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.035741] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 895.035741] env[65680]: value = "task-2847916" [ 895.035741] env[65680]: _type = "Task" [ 895.035741] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 895.043303] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847916, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 895.545893] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847916, 'name': CreateVM_Task, 'duration_secs': 0.28636} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 895.546097] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 895.546776] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 895.546938] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 895.547276] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 895.547534] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-53956258-0f90-4631-b317-c8b1c80d88a1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.553265] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 895.553265] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52480631-c12c-f563-abc2-51e43afb2736" [ 895.553265] env[65680]: _type = "Task" [ 895.553265] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 895.560669] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52480631-c12c-f563-abc2-51e43afb2736, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 896.063607] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 896.063877] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 896.064034] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 896.804958] env[65680]: DEBUG nova.compute.manager [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Received event network-changed-93599a65-fa65-48cd-80fe-0d734baa05b1 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 896.805057] env[65680]: DEBUG nova.compute.manager [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Refreshing instance network info cache due to event network-changed-93599a65-fa65-48cd-80fe-0d734baa05b1. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 896.805274] env[65680]: DEBUG oslo_concurrency.lockutils [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] Acquiring lock "refresh_cache-abb69e61-9594-48b5-b3f4-f8ba39f93f0e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 896.805415] env[65680]: DEBUG oslo_concurrency.lockutils [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] Acquired lock "refresh_cache-abb69e61-9594-48b5-b3f4-f8ba39f93f0e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 896.805573] env[65680]: DEBUG nova.network.neutron [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Refreshing network info cache for port 93599a65-fa65-48cd-80fe-0d734baa05b1 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 897.302362] env[65680]: DEBUG nova.network.neutron [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Updated VIF entry in instance network info cache for port 93599a65-fa65-48cd-80fe-0d734baa05b1. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 897.302714] env[65680]: DEBUG nova.network.neutron [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Updating instance_info_cache with network_info: [{"id": "93599a65-fa65-48cd-80fe-0d734baa05b1", "address": "fa:16:3e:dc:26:8c", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap93599a65-fa", "ovs_interfaceid": "93599a65-fa65-48cd-80fe-0d734baa05b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 897.313860] env[65680]: DEBUG oslo_concurrency.lockutils [req-3918f9b1-8b45-4736-bdf0-a06b1609e34e req-e2b48401-ab57-438a-b10c-c36f99e0fc1c service nova] Releasing lock "refresh_cache-abb69e61-9594-48b5-b3f4-f8ba39f93f0e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 924.294643] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.295031] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 924.295031] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 924.307875] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 924.308035] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 924.308169] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 924.308295] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 925.293341] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 925.293582] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.293554] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.293810] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.293969] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.294127] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 929.294082] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.293358] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.303048] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 930.303398] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 930.303534] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.303589] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 930.304716] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-170c3598-efb6-4838-9639-d0b2d7cbca0c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.313679] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03893ae8-59a8-4b4f-9c62-c75f27e8b17d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.327220] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3c24771-6e4d-4c9f-ba4b-70f7b0890703 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.333313] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aedef71-4c12-46a8-92b1-a6057c77d7d6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.361575] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181059MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 930.361722] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 930.361985] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 930.412561] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance acbe2170-7ce3-4820-b082-6680e559bde1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 930.412722] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance e5d6d263-463e-46b8-9bb3-d10a4101d4e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 930.412884] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance abb69e61-9594-48b5-b3f4-f8ba39f93f0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 930.426688] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 05ef6eca-eb64-43b3-8c7d-b5a230282a8f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 930.436860] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 8b747838-fcd0-494c-bd5a-0e5b1950a44e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 930.451028] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 01e82211-1de5-44ad-b14e-81a54470d4e5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 930.460516] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dd382edd-abe8-4764-a9d5-4144ef7d50b0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 930.469863] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c9230f1c-72ea-4f62-be9f-949def49c5f4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 930.479953] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 132e6039-55dc-4118-bcd5-d32557743981 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 930.480219] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 930.480368] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 930.580302] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92fb3a84-ecba-4c38-9577-0bcd6ba053f4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.587877] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d66bbee0-ba24-4809-a43a-fa6bf895d24d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.616590] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-752ad1af-5545-4d29-b9e4-0ae80d4859e1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.623032] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f077011-3c26-4c59-bfa8-f1fa7df06539 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.635543] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 930.644550] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 930.657855] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 930.658041] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.296s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.658152] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 941.718532] env[65680]: WARNING oslo_vmware.rw_handles [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 941.718532] env[65680]: ERROR oslo_vmware.rw_handles [ 941.719206] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to 
vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 941.720764] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 941.721069] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Copying Virtual Disk [datastore1] vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/812b11fb-b4ad-4c97-bec4-717da09071a0/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 941.721343] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a7d7c6f4-699f-4af6-bb9b-f7f502b45cc9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 941.730252] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Waiting for the task: (returnval){ [ 941.730252] env[65680]: value = "task-2847917" [ 941.730252] env[65680]: _type = "Task" [ 941.730252] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 941.738175] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Task: {'id': task-2847917, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 942.241070] env[65680]: DEBUG oslo_vmware.exceptions [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 942.241285] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 942.241844] env[65680]: ERROR nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 942.241844] env[65680]: Faults: ['InvalidArgument'] [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Traceback (most recent call last): [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] yield resources [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self.driver.spawn(context, instance, image_meta, [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self._fetch_image_if_missing(context, vi) [ 942.241844] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] image_cache(vi, tmp_image_ds_loc) [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] vm_util.copy_virtual_disk( [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] session._wait_for_task(vmdk_copy_task) [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] return self.wait_for_task(task_ref) [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] return evt.wait() [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] result = hub.switch() [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 942.242170] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] return self.greenlet.switch() [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self.f(*self.args, **self.kw) [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] raise exceptions.translate_fault(task_info.error) [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Faults: ['InvalidArgument'] [ 942.242478] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] [ 942.242478] env[65680]: INFO nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Terminating instance [ 942.243757] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 942.244323] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 942.244323] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-83a6dce5-316b-4b03-ab86-7fe735473c8f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.246517] env[65680]: DEBUG 
nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 942.246708] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 942.247408] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c3366fb-53dc-45a8-9d0f-629ddfc47321 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.253767] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 942.253996] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ad46dc37-d063-462a-b32b-541adc5b5b05 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.256172] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 942.256347] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 942.257255] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87233646-485a-4d96-a496-0d457585cae1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.263111] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Waiting for the task: (returnval){ [ 942.263111] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]520816e5-2c7a-2a4a-1d80-73b6cb27ea53" [ 942.263111] env[65680]: _type = "Task" [ 942.263111] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 942.272788] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]520816e5-2c7a-2a4a-1d80-73b6cb27ea53, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 942.322371] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 942.322599] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 942.322836] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Deleting the datastore file [datastore1] acbe2170-7ce3-4820-b082-6680e559bde1 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 942.323193] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9ff573ef-952c-43a2-85db-d5d531e93ae4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.328850] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Waiting for the task: (returnval){ [ 942.328850] env[65680]: value = "task-2847919" [ 942.328850] env[65680]: _type = "Task" [ 942.328850] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 942.336284] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Task: {'id': task-2847919, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 942.773259] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 942.773606] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Creating directory with path [datastore1] vmware_temp/643574c8-782f-4769-abf4-4408f68cf88b/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 942.773770] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1879bd7b-1788-4fcd-92bf-4797d1ae3eb1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.784783] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Created directory with path [datastore1] vmware_temp/643574c8-782f-4769-abf4-4408f68cf88b/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 942.784981] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Fetch image to [datastore1] vmware_temp/643574c8-782f-4769-abf4-4408f68cf88b/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 942.785161] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/643574c8-782f-4769-abf4-4408f68cf88b/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 942.785895] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dba33f2c-f9d6-4142-b5a3-daf52a91116f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.792344] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26587a05-a3d7-4704-a0fe-4df70545b829 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.801246] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fa16a6c-f964-4ce7-a9cb-cdc4f87b19a4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.834344] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5543e5f-ab1a-4ebe-94e3-0041bb713802 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.840916] env[65680]: DEBUG oslo_vmware.api [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Task: {'id': task-2847919, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077634} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 942.842267] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 942.842456] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 942.842625] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 942.842795] env[65680]: INFO nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Took 0.60 seconds to destroy the instance on the hypervisor. 
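
The CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same oslo.vmware pattern: the vCenter call returns a Task managed object immediately, and the caller polls it ("_poll_task ... progress is 0%") until it reports success or error, at which point an error is translated into a fault exception such as the VimFaultException ("A specified parameter was not correct: fileType") that aborted the spawn of acbe2170-7ce3-4820-b082-6680e559bde1. The snippet below is a minimal, hypothetical sketch of that polling loop, not the actual oslo_vmware.api code; poll_vsphere_task, get_task_info and the TaskInfo fields it reads are illustrative stand-ins.

    # Hypothetical sketch of the task-polling pattern visible in the log
    # (wait_for_task -> _poll_task -> raise a translated fault on error).
    # The helper names and the get_task_info() callable are illustrative,
    # not the real oslo_vmware.api implementation.
    import time


    class VimFaultError(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, message, fault_list):
            super().__init__(message)
            self.fault_list = fault_list


    def poll_vsphere_task(get_task_info, interval=0.5):
        """Poll a vSphere Task until it finishes.

        get_task_info is assumed to return an object with .state
        ('running', 'success' or 'error'), plus .error_msg and .faults.
        """
        while True:
            info = get_task_info()
            if info.state == "running":
                # Matches the periodic "progress is N%" lines in the log.
                time.sleep(interval)
                continue
            if info.state == "success":
                # Matches "... completed successfully" with duration_secs.
                return info
            # state == "error": translate the fault before the caller sees it,
            # which is what produces the ERROR tracebacks shown above.
            raise VimFaultError(info.error_msg, info.faults)
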
[ 942.844525] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ba763ba0-bc7d-45c4-a0f7-d0752b061ce2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.846374] env[65680]: DEBUG nova.compute.claims [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 942.846541] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 942.846749] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 942.869883] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 942.987567] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a7ef482-e64c-4ed9-8113-2734149b9ddb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 942.994976] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cff39b14-ef70-4cad-8667-c6f2d06a05dc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.025676] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-981df01a-c6b2-4b64-a30e-4ccab2daffa1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.033144] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-217aa9ec-7005-4eb6-822c-508c087a120f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.046047] env[65680]: DEBUG nova.compute.provider_tree [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 943.056798] env[65680]: DEBUG nova.scheduler.client.report [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] 
Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 943.070495] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.224s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.071089] env[65680]: ERROR nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 943.071089] env[65680]: Faults: ['InvalidArgument'] [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Traceback (most recent call last): [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self.driver.spawn(context, instance, image_meta, [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self._fetch_image_if_missing(context, vi) [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] image_cache(vi, tmp_image_ds_loc) [ 943.071089] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] vm_util.copy_virtual_disk( [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] session._wait_for_task(vmdk_copy_task) [ 943.071382] 
env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] return self.wait_for_task(task_ref) [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] return evt.wait() [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] result = hub.switch() [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] return self.greenlet.switch() [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 943.071382] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] self.f(*self.args, **self.kw) [ 943.071651] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 943.071651] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] raise exceptions.translate_fault(task_info.error) [ 943.071651] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 943.071651] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Faults: ['InvalidArgument'] [ 943.071651] env[65680]: ERROR nova.compute.manager [instance: acbe2170-7ce3-4820-b082-6680e559bde1] [ 943.071890] env[65680]: DEBUG nova.compute.utils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 943.073341] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Build of instance acbe2170-7ce3-4820-b082-6680e559bde1 was re-scheduled: A specified parameter was not correct: fileType [ 943.073341] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 943.073716] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Unplugging VIFs for instance {{(pid=65680) 
_cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 943.073896] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 943.074216] env[65680]: DEBUG nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 943.074295] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 943.081508] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 943.084021] env[65680]: ERROR nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] result = getattr(controller, method)(*args, **kwargs) [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._get(image_id) [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 943.084021] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] resp, body = self.http_client.get(url, headers=header) [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.request(url, 'GET', **kwargs) [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._handle_response(resp) [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise exc.from_response(resp, resp.content) [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] During handling of the above exception, another exception occurred: [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 943.084395] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] yield resources [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.driver.spawn(context, instance, image_meta, [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._fetch_image_if_missing(context, vi) [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] image_fetch(context, vi, tmp_image_ds_loc) [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] images.fetch_image( [ 943.084662] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] metadata = IMAGE_API.get(context, image_ref) [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return session.show(context, image_id, [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] _reraise_translated_image_exception(image_id) [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise new_exc.with_traceback(exc_trace) [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] result = getattr(controller, method)(*args, **kwargs) [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 943.084948] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._get(image_id) [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] resp, body = self.http_client.get(url, headers=header) [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.request(url, 'GET', **kwargs) [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._handle_response(resp) [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise exc.from_response(resp, resp.content) [ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 943.085239] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 943.085491] env[65680]: INFO nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Terminating instance [ 943.085491] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 943.085491] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 943.085491] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 943.085595] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 943.085809] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-757d5832-6e03-460d-a255-ddce659c0569 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.088648] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7b49750-7277-49b4-8509-6d422d1a8a5e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.097066] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 943.098045] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-46db7159-ca79-4187-93ea-e0a1a42fc97b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.099416] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 943.099587] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Folder [datastore1] 
devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 943.100240] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aea61dd1-dda6-44c7-a38f-ddd32edbddc0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.105137] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Waiting for the task: (returnval){ [ 943.105137] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]526459e2-d961-6fc5-40cb-7e3488c6a67d" [ 943.105137] env[65680]: _type = "Task" [ 943.105137] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 943.114947] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]526459e2-d961-6fc5-40cb-7e3488c6a67d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 943.170242] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 943.170473] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 943.170790] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Deleting the datastore file [datastore1] f05204a0-268f-4d77-a2bf-cde4ee02915e {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 943.171130] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1c88107c-3354-476e-9fcd-02a2c37aa98d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.177242] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Waiting for the task: (returnval){ [ 943.177242] env[65680]: value = "task-2847921" [ 943.177242] env[65680]: _type = "Task" [ 943.177242] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 943.185122] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Task: {'id': task-2847921, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 943.333156] env[65680]: DEBUG nova.network.neutron [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 943.342912] env[65680]: INFO nova.compute.manager [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Took 0.27 seconds to deallocate network for instance. [ 943.425405] env[65680]: INFO nova.scheduler.client.report [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Deleted allocations for instance acbe2170-7ce3-4820-b082-6680e559bde1 [ 943.441768] env[65680]: DEBUG oslo_concurrency.lockutils [None req-42c1e54f-9223-4d23-a242-ccc3e00d4c31 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "acbe2170-7ce3-4820-b082-6680e559bde1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 374.266s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.443228] env[65680]: DEBUG oslo_concurrency.lockutils [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "acbe2170-7ce3-4820-b082-6680e559bde1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 177.615s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 943.443515] env[65680]: DEBUG oslo_concurrency.lockutils [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Acquiring lock "acbe2170-7ce3-4820-b082-6680e559bde1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 943.443784] env[65680]: DEBUG oslo_concurrency.lockutils [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "acbe2170-7ce3-4820-b082-6680e559bde1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 943.444090] env[65680]: DEBUG oslo_concurrency.lockutils [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "acbe2170-7ce3-4820-b082-6680e559bde1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.446618] env[65680]: INFO nova.compute.manager [None 
req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Terminating instance [ 943.454629] env[65680]: DEBUG nova.compute.manager [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 943.454629] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 943.454811] env[65680]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-172b1370-efd7-4626-96f3-cb38b5c7c4ea {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.462804] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 943.469947] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00bd7605-d7cc-4c23-8dda-5b5053dd0178 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.503987] env[65680]: WARNING nova.virt.vmwareapi.vmops [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance acbe2170-7ce3-4820-b082-6680e559bde1 could not be found. [ 943.504224] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 943.504398] env[65680]: INFO nova.compute.manager [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Took 0.05 seconds to destroy the instance on the hypervisor. [ 943.504636] env[65680]: DEBUG oslo.service.loopingcall [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 943.506934] env[65680]: DEBUG nova.compute.manager [-] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 943.507046] env[65680]: DEBUG nova.network.neutron [-] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 943.520807] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 943.521084] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 943.522575] env[65680]: INFO nova.compute.claims [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 943.617130] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 943.617130] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Creating directory with path [datastore1] vmware_temp/f3acbd4e-e688-4ad0-8b45-a1787ec86633/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 943.617130] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-833f183c-0a95-4584-a3c8-7d94be439965 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.631321] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Created directory with path [datastore1] vmware_temp/f3acbd4e-e688-4ad0-8b45-a1787ec86633/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 943.631321] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Fetch image to [datastore1] vmware_temp/f3acbd4e-e688-4ad0-8b45-a1787ec86633/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 943.631321] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/f3acbd4e-e688-4ad0-8b45-a1787ec86633/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 943.631321] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12624196-8034-4bd4-ba94-a00fbea1264a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.646313] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-312f9611-151b-4862-ad98-873a49280559 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.658285] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3cff4b9-db0c-41eb-b78a-ad9820c98f75 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.697568] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a7f958f-1e7f-4edb-bcde-1bc965b32d91 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.700570] env[65680]: DEBUG nova.network.neutron [-] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 943.706920] env[65680]: DEBUG oslo_vmware.api [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Task: {'id': task-2847921, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072068} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 943.708820] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 943.708820] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 943.708820] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 943.708961] env[65680]: INFO nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Took 0.62 seconds to destroy the instance on the hypervisor. [ 943.711716] env[65680]: DEBUG nova.compute.claims [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 943.711716] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 943.711716] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0c0261c5-cd5e-4ee4-925e-c8157870c169 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.713432] env[65680]: INFO nova.compute.manager [-] [instance: acbe2170-7ce3-4820-b082-6680e559bde1] Took 0.21 seconds to deallocate network for instance. 
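The ImageNotAuthorized build failures in this run (instance f05204a0-268f-4d77-a2bf-cde4ee02915e above, and instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 further below) share one chain: glanceclient's images.get() comes back with HTTP 401, and nova.image.glance re-raises it through _reraise_translated_image_exception() as nova.exception.ImageNotAuthorized, which is what the compute manager records before aborting the claim and destroying the half-built instance. A minimal standalone sketch of only the glanceclient side follows; the endpoint and token are placeholders rather than values from this run, and Nova's translation step is described only in a comment.

    # Standalone sketch of the Glance GET that returns HTTP 401 in the log above.
    # GLANCE_ENDPOINT and TOKEN are placeholders; only the image UUID is from the log.
    from glanceclient import Client
    from glanceclient import exc as glance_exc

    GLANCE_ENDPOINT = 'http://controller:9292'          # placeholder
    TOKEN = 'expired-or-mis-scoped-token'               # placeholder
    IMAGE_ID = '43113302-7f85-4bd9-95eb-c8e71f92d770'   # image UUID seen in the log

    glance = Client('2', endpoint=GLANCE_ENDPOINT, token=TOKEN)
    try:
        glance.images.get(IMAGE_ID)
    except glance_exc.HTTPUnauthorized as err:
        # Nova's image API wrapper turns this 401 into
        # nova.exception.ImageNotAuthorized before the compute manager
        # aborts the resource claim and terminates the build.
        print('Glance rejected the request: %s' % err)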
[ 943.735845] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 943.744142] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3155859-91ab-458e-8f66-71b298e8ac92 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.754586] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a9d2b94-b571-47d0-95af-895aaafb3be9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.785637] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e645650-6459-44a2-b5e9-bd6b48b03bca {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.794871] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef30a103-a63a-4a32-952e-f1b0f0c95e0b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 943.807616] env[65680]: DEBUG nova.compute.provider_tree [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 943.817191] env[65680]: DEBUG nova.scheduler.client.report [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 943.830689] env[65680]: DEBUG oslo_concurrency.lockutils [None req-d11d922d-45f5-4e89-b58a-78cfb2788a57 tempest-ServerDiagnosticsNegativeTest-749488961 tempest-ServerDiagnosticsNegativeTest-749488961-project-member] Lock "acbe2170-7ce3-4820-b082-6680e559bde1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.388s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.832255] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.832712] env[65680]: DEBUG 
nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 943.835119] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.124s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 943.862394] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.862572] env[65680]: DEBUG nova.compute.utils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance f05204a0-268f-4d77-a2bf-cde4ee02915e could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 943.864016] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 943.864204] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 943.864361] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 943.864523] env[65680]: DEBUG nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 943.864675] env[65680]: DEBUG nova.network.neutron [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 943.868857] env[65680]: DEBUG nova.compute.utils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 943.870332] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 943.871937] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 943.877796] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 943.943829] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 944.011447] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 944.012377] env[65680]: ERROR nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] result = getattr(controller, method)(*args, **kwargs) [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._get(image_id) [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.012377] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] resp, body = self.http_client.get(url, headers=header) [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.request(url, 'GET', **kwargs) [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._handle_response(resp) [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise exc.from_response(resp, resp.content) [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] During handling of the above exception, another exception occurred: [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.012797] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] yield resources [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.driver.spawn(context, instance, image_meta, [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._fetch_image_if_missing(context, vi) [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] image_fetch(context, vi, tmp_image_ds_loc) [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] images.fetch_image( [ 944.013425] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] metadata = IMAGE_API.get(context, image_ref) [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return session.show(context, image_id, [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] _reraise_translated_image_exception(image_id) [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise new_exc.with_traceback(exc_trace) [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] result = getattr(controller, method)(*args, **kwargs) [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.013827] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._get(image_id) [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] resp, body = self.http_client.get(url, headers=header) [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.request(url, 'GET', **kwargs) [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._handle_response(resp) [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise exc.from_response(resp, resp.content) [ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 944.014247] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.014644] env[65680]: INFO nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Terminating instance [ 944.014644] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 944.014644] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 944.016688] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 944.016910] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 944.017081] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 944.017274] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 944.017420] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 944.017573] env[65680]: DEBUG nova.virt.hardware [None 
req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 944.017794] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 944.017955] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 944.018139] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 944.018303] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 944.018484] env[65680]: DEBUG nova.virt.hardware [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 944.019106] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 944.019296] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 944.020947] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3fef4a51-3524-42e3-b365-cfb3edb0d18e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.023360] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df80745-c4ab-4d8c-813e-5daf5270eb51 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.026407] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a6bb9c-b638-4219-a32b-b222edf844aa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.030451] env[65680]: DEBUG nova.policy [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '76e0a85be35f45538483c22f0b7d2202', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f89b77f14b3b46e58e32fbb9c68c9ca5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 944.038878] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aa9e05e-af9c-4cb8-ab39-8daab7122480 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.043999] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 944.044213] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 944.045171] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0aa3974-7f66-4d63-b6fe-57d9f5961eb4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.056627] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 944.058382] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-60b485b9-4fbc-41e4-8d11-279606befcfa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.061804] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Waiting for the task: (returnval){ [ 944.061804] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52af4b07-685f-af90-6452-b6413f0b5071" [ 944.061804] env[65680]: _type = "Task" [ 944.061804] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.070755] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52af4b07-685f-af90-6452-b6413f0b5071, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 944.125750] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 944.125950] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 944.126402] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Deleting the datastore file [datastore1] f989cbee-9d5c-459f-b7a0-bf2259dadbb0 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 944.126961] env[65680]: DEBUG neutronclient.v2_0.client [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 944.130768] env[65680]: ERROR nova.compute.manager [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] result = getattr(controller, method)(*args, **kwargs) [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._get(image_id) [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.130768] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] resp, body = self.http_client.get(url, headers=header) [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.request(url, 'GET', **kwargs) [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._handle_response(resp) [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise exc.from_response(resp, resp.content) [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] During handling of the above exception, another exception occurred: [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 944.132572] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.driver.spawn(context, instance, image_meta, [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._fetch_image_if_missing(context, vi) [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] image_fetch(context, vi, tmp_image_ds_loc) [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] images.fetch_image( [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] metadata = IMAGE_API.get(context, image_ref) [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 944.133210] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return session.show(context, image_id, [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] _reraise_translated_image_exception(image_id) [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise new_exc.with_traceback(exc_trace) [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File 
"/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] result = getattr(controller, method)(*args, **kwargs) [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._get(image_id) [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.134378] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] resp, body = self.http_client.get(url, headers=header) [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.request(url, 'GET', **kwargs) [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._handle_response(resp) [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise exc.from_response(resp, resp.content) [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] During handling of the above exception, another exception occurred: [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 944.135800] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._build_and_run_instance(context, instance, image, [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] with excutils.save_and_reraise_exception(): [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.force_reraise() [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise self.value [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] with self.rt.instance_claim(context, instance, node, allocs, [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.abort() [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 944.136270] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return f(*args, **kwargs) [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._unset_instance_host_and_node(instance) [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: 
f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] instance.save() [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] updates, result = self.indirection_api.object_action( [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return cctxt.call(context, 'object_action', objinst=objinst, [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 944.136748] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] result = self.transport._send( [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._driver.send(target, ctxt, message, [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise result [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] nova.exception_Remote.InstanceNotFound_Remote: Instance f05204a0-268f-4d77-a2bf-cde4ee02915e could not be found. 
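The traceback above also shows the cleanup-guard used when a build fails: the original error is handled inside "with excutils.save_and_reraise_exception():", the resource claim is aborted, and when that cleanup itself fails (here, instance.save() hitting InstanceNotFound over RPC) the new error propagates and the tracebacks chain. Below is a simplified, self-contained stand-in for that save-and-reraise context manager, written only to illustrate the control flow visible in the log; it is not the oslo_utils implementation.

import contextlib
import logging
import sys

LOG = logging.getLogger(__name__)

@contextlib.contextmanager
def save_and_reraise(reraise=True):
    # Capture the exception currently being handled before running the
    # cleanup body, so cleanup code cannot silently swallow it.
    _exc_type, exc_value, exc_tb = sys.exc_info()
    try:
        yield
    except Exception:
        # Cleanup itself failed: note the error we were about to re-raise and
        # let the new failure propagate, producing chained tracebacks like the
        # ones logged above.
        LOG.exception("original exception superseded: %s", exc_value)
        raise
    if reraise and exc_value is not None:
        raise exc_value.with_traceback(exc_tb)

def abort_claim():
    # Hypothetical cleanup step that fails, standing in for instance.save()
    # raising InstanceNotFound in the log above.
    raise RuntimeError("instance could not be found")

try:
    try:
        raise ValueError("not authorized for image")   # the original build failure
    except Exception:
        with save_and_reraise():
            abort_claim()
except Exception as exc:
    print(type(exc).__name__, exc)  # the later cleanup failure surfaces, as in the log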
[ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return getattr(target, method)(*args, **kwargs) [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137329] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return fn(self, *args, **kwargs) [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] old_ref, inst_ref = db.instance_update_and_get_original( [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return f(*args, **kwargs) [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] with excutils.save_and_reraise_exception() as ectxt: [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.force_reraise() [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.137779] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise self.value [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return f(*args, **kwargs) [ 944.138283] 
env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return f(context, *args, **kwargs) [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise exception.InstanceNotFound(instance_id=uuid) [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138283] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] nova.exception.InstanceNotFound: Instance f05204a0-268f-4d77-a2bf-cde4ee02915e could not be found. [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] During handling of the above exception, another exception occurred: [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] ret = obj(*args, **kwargs) [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] exception_handler_v20(status_code, error_body) [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise client_exc(message=error_message, [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: 
f05204a0-268f-4d77-a2bf-cde4ee02915e] Neutron server returns request_ids: ['req-92f3c933-6f89-43a1-8fd7-4b314c602602'] [ 944.138738] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] During handling of the above exception, another exception occurred: [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Traceback (most recent call last): [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._deallocate_network(context, instance, requested_networks) [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self.network_api.deallocate_for_instance( [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] data = neutron.list_ports(**search_opts) [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] ret = obj(*args, **kwargs) [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 944.139240] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.list('ports', self.ports_path, retrieve_all, [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] ret = obj(*args, **kwargs) [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] for r in self._pagination(collection, path, **params): [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] res = self.get(path, params=params) [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: 
f05204a0-268f-4d77-a2bf-cde4ee02915e] ret = obj(*args, **kwargs) [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.retry_request("GET", action, body=body, [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] ret = obj(*args, **kwargs) [ 944.139647] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] return self.do_request(method, action, body=body, [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] ret = obj(*args, **kwargs) [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] self._handle_fault_response(status_code, replybody, resp) [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] raise exception.Unauthorized() [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] nova.exception.Unauthorized: Not authorized. [ 944.139972] env[65680]: ERROR nova.compute.manager [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] [ 944.139972] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2e50582c-7de7-4f13-ada6-d016792d1374 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.139972] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Waiting for the task: (returnval){ [ 944.139972] env[65680]: value = "task-2847923" [ 944.139972] env[65680]: _type = "Task" [ 944.139972] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.148089] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Task: {'id': task-2847923, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 944.153018] env[65680]: DEBUG oslo_concurrency.lockutils [None req-c35498d9-7a04-4ffe-88d8-4f833e3b16ce tempest-DeleteServersTestJSON-396151702 tempest-DeleteServersTestJSON-396151702-project-member] Lock "f05204a0-268f-4d77-a2bf-cde4ee02915e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 312.422s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 944.164099] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 944.214282] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 944.214536] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 944.216033] env[65680]: INFO nova.compute.claims [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 944.372045] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b979792-4862-4c07-9a69-2e24d0e22625 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.379761] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92d8bdbc-ef63-432a-b3af-ef37b9535a27 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.410276] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b25e2be7-77aa-42be-85b1-d51898899d9c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.417633] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef0f376-b9ea-417a-8685-79d966a68225 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.431133] env[65680]: DEBUG nova.compute.provider_tree [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 944.439662] env[65680]: DEBUG nova.scheduler.client.report [None 
req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 944.454215] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 944.454713] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 944.475377] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Successfully created port: c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 944.490478] env[65680]: DEBUG nova.compute.utils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 944.491946] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 944.492145] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 944.502709] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 944.574205] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 944.576337] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 944.576581] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Creating directory with path [datastore1] vmware_temp/ca91701c-351d-4802-ad86-8d5a9ff06d23/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 944.578264] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6776f43c-0b75-4f45-9a09-169ae91d3690 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.582528] env[65680]: DEBUG nova.policy [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65370eccd0d14c8ca37ee4ab40141956', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb30ad84896d480885a85e4621656c6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 944.591155] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Created directory with path [datastore1] vmware_temp/ca91701c-351d-4802-ad86-8d5a9ff06d23/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 944.591155] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Fetch image to [datastore1] vmware_temp/ca91701c-351d-4802-ad86-8d5a9ff06d23/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 944.591155] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/ca91701c-351d-4802-ad86-8d5a9ff06d23/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store 
datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 944.591155] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-284183e1-3033-4155-ad85-5c17cf125c6a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.599336] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-604f22eb-10f5-49c7-8c6b-87dfe2220315 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.604139] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 944.604381] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 944.604539] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 944.604718] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 944.604917] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 944.605090] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 944.605304] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 944.605461] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 944.605627] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 944.605786] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 944.605955] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 944.606689] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccd1d0df-7d89-47e8-8f35-f42889e6d848 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.617959] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f86bc3d0-49e1-477a-8c35-bcb083ee89dc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.622743] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb12b728-f720-45fd-bcf5-3e958ba22902 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.662926] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42cc8655-b95c-4484-940a-0f185203862d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.671180] env[65680]: DEBUG oslo_vmware.api [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Task: {'id': task-2847923, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082473} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 944.672631] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 944.672892] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 944.673011] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 944.673205] env[65680]: INFO nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Took 0.65 seconds to destroy the instance on the hypervisor. [ 944.675237] env[65680]: DEBUG nova.compute.claims [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 944.675410] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 944.675618] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 944.678030] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dd774704-46b3-4892-96e9-9fed308cd812 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.700058] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 944.704457] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 944.705151] env[65680]: DEBUG nova.compute.utils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 944.706684] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 944.706761] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 944.707929] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 944.707929] env[65680]: DEBUG nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 944.707929] env[65680]: DEBUG nova.network.neutron [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 944.781868] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 944.782663] env[65680]: ERROR nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] result = getattr(controller, method)(*args, **kwargs) [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._get(image_id) [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.782663] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] resp, body = self.http_client.get(url, headers=header) [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.request(url, 'GET', **kwargs) [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._handle_response(resp) [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise exc.from_response(resp, resp.content) [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] During handling of the above exception, another exception occurred: [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 944.783222] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] yield resources [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.driver.spawn(context, instance, image_meta, [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._vmops.spawn(context, instance, image_meta, injected_files, [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._fetch_image_if_missing(context, vi) [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] image_fetch(context, vi, tmp_image_ds_loc) [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] images.fetch_image( [ 944.783693] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] metadata = IMAGE_API.get(context, image_ref) [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return session.show(context, image_id, [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] _reraise_translated_image_exception(image_id) [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise new_exc.with_traceback(exc_trace) [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] result = getattr(controller, method)(*args, **kwargs) [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.784383] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._get(image_id) [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] resp, body = self.http_client.get(url, headers=header) [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.request(url, 'GET', **kwargs) [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._handle_response(resp) [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise exc.from_response(resp, resp.content) [ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 944.784887] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 944.785372] env[65680]: INFO nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Terminating instance [ 944.785372] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 944.785372] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 944.785803] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 944.786128] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 944.786262] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11b2b96a-d992-4dcf-8eb1-115608b763cc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.790030] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f919ea4e-4d8e-4b57-980b-b676ad950190 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.798713] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 944.798988] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a4a7db02-2c78-4798-8704-d4e4001e6adc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.801758] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 944.801990] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 
tempest-ServerRescueNegativeTestJSON-17755879-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 944.802958] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b22f868f-a3ed-4f0e-9ced-af6f0bf79c9e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.808150] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Waiting for the task: (returnval){ [ 944.808150] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5254255e-2054-3349-0c88-2a514799c12e" [ 944.808150] env[65680]: _type = "Task" [ 944.808150] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.819158] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5254255e-2054-3349-0c88-2a514799c12e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 944.866196] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 944.866431] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 944.866614] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Deleting the datastore file [datastore1] 40a7ee3c-8627-47f3-887e-31112586e799 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 944.867998] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a328b325-0243-4a64-831c-6aa5ab79acdf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.874764] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Waiting for the task: (returnval){ [ 944.874764] env[65680]: value = "task-2847925" [ 944.874764] env[65680]: _type = "Task" [ 944.874764] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.884829] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Task: {'id': task-2847925, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 944.902776] env[65680]: DEBUG neutronclient.v2_0.client [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 944.904414] env[65680]: ERROR nova.compute.manager [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] result = getattr(controller, method)(*args, **kwargs) [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._get(image_id) [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.904414] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] resp, body = self.http_client.get(url, headers=header) [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.request(url, 'GET', **kwargs) [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._handle_response(resp) [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 
944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise exc.from_response(resp, resp.content) [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] During handling of the above exception, another exception occurred: [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.904730] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.driver.spawn(context, instance, image_meta, [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._fetch_image_if_missing(context, vi) [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] image_fetch(context, vi, tmp_image_ds_loc) [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] images.fetch_image( [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] metadata = IMAGE_API.get(context, image_ref) [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 944.904990] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return session.show(context, image_id, [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 944.905345] env[65680]: ERROR nova.compute.manager 
[instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] _reraise_translated_image_exception(image_id) [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise new_exc.with_traceback(exc_trace) [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] result = getattr(controller, method)(*args, **kwargs) [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._get(image_id) [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 944.905345] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] resp, body = self.http_client.get(url, headers=header) [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.request(url, 'GET', **kwargs) [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._handle_response(resp) [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise exc.from_response(resp, resp.content) [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] During handling of the above exception, another exception occurred: [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.905617] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._build_and_run_instance(context, instance, image, [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] with excutils.save_and_reraise_exception(): [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.force_reraise() [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise self.value [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] with self.rt.instance_claim(context, instance, node, allocs, [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.abort() [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 944.905892] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return f(*args, **kwargs) [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._unset_instance_host_and_node(instance) [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: 
f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] instance.save() [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] updates, result = self.indirection_api.object_action( [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return cctxt.call(context, 'object_action', objinst=objinst, [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 944.906249] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] result = self.transport._send( [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._driver.send(target, ctxt, message, [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise result [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] nova.exception_Remote.InstanceNotFound_Remote: Instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 could not be found. 
[ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return getattr(target, method)(*args, **kwargs) [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906514] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return fn(self, *args, **kwargs) [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] old_ref, inst_ref = db.instance_update_and_get_original( [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return f(*args, **kwargs) [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] with excutils.save_and_reraise_exception() as ectxt: [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.force_reraise() [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.906799] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise self.value [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return f(*args, **kwargs) [ 944.907127] 
env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return f(context, *args, **kwargs) [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise exception.InstanceNotFound(instance_id=uuid) [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907127] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] nova.exception.InstanceNotFound: Instance f989cbee-9d5c-459f-b7a0-bf2259dadbb0 could not be found. [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] During handling of the above exception, another exception occurred: [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] ret = obj(*args, **kwargs) [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] exception_handler_v20(status_code, error_body) [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise client_exc(message=error_message, [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: 
f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Neutron server returns request_ids: ['req-c313bb7e-6991-48e7-914f-97e4a4895cde'] [ 944.907575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] During handling of the above exception, another exception occurred: [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Traceback (most recent call last): [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._deallocate_network(context, instance, requested_networks) [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self.network_api.deallocate_for_instance( [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] data = neutron.list_ports(**search_opts) [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] ret = obj(*args, **kwargs) [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 944.907882] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.list('ports', self.ports_path, retrieve_all, [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] ret = obj(*args, **kwargs) [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] for r in self._pagination(collection, path, **params): [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] res = self.get(path, params=params) [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: 
f989cbee-9d5c-459f-b7a0-bf2259dadbb0] ret = obj(*args, **kwargs) [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.retry_request("GET", action, body=body, [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] ret = obj(*args, **kwargs) [ 944.909063] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] return self.do_request(method, action, body=body, [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] ret = obj(*args, **kwargs) [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] self._handle_fault_response(status_code, replybody, resp) [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] raise exception.Unauthorized() [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] nova.exception.Unauthorized: Not authorized. [ 944.909575] env[65680]: ERROR nova.compute.manager [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] [ 944.928098] env[65680]: DEBUG oslo_concurrency.lockutils [None req-b520e745-409f-44bb-a2d6-d3154c4251fb tempest-ImagesNegativeTestJSON-168801522 tempest-ImagesNegativeTestJSON-168801522-project-member] Lock "f989cbee-9d5c-459f-b7a0-bf2259dadbb0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 312.488s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 944.941273] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 945.002581] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.002860] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.004453] env[65680]: INFO nova.compute.claims [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 945.063271] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Successfully created port: 908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 945.198877] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bf9d082-4c84-4fb9-adee-c007397212ca {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.207238] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ba2963-f208-4493-a4f8-65b9a0e03456 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.240224] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-830a973b-bb8f-4724-89c5-0cfcae95a1e7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.247954] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7309c3d-85a1-4a04-ab6e-77b4d91ef725 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.261961] env[65680]: DEBUG nova.compute.provider_tree [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 945.271604] env[65680]: DEBUG nova.scheduler.client.report [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 945.286706] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.287189] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 945.318661] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 945.318661] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Creating directory with path [datastore1] vmware_temp/a6c5ce95-a36e-4266-9e0e-787855c9d8ff/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 945.318661] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-78a6255c-293e-48da-a13f-e69d0d4a22da {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.321799] env[65680]: DEBUG nova.compute.utils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 945.323257] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 945.323445] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 945.331982] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 945.335638] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Created directory with path [datastore1] vmware_temp/a6c5ce95-a36e-4266-9e0e-787855c9d8ff/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 945.335831] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Fetch image to [datastore1] vmware_temp/a6c5ce95-a36e-4266-9e0e-787855c9d8ff/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 945.336033] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/a6c5ce95-a36e-4266-9e0e-787855c9d8ff/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 945.336939] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f580f565-1c78-4ce9-b376-e341fda734da {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.346219] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a71ba07-5d34-4770-9efa-b2b0af7e9fcb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.355999] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e02eff1b-0686-48b7-8c3b-f21b38f21fd1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.396527] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ba0d738-12ef-48d7-8df6-beb99f1b2aec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.399892] env[65680]: DEBUG nova.compute.manager [req-ef92e4b6-3119-47de-9a5a-5a1ad40e6654 req-f94dae28-59d3-41db-998c-0f62f439ec1e service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Received event network-vif-plugged-c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 945.400436] env[65680]: DEBUG oslo_concurrency.lockutils [req-ef92e4b6-3119-47de-9a5a-5a1ad40e6654 req-f94dae28-59d3-41db-998c-0f62f439ec1e service nova] Acquiring lock "05ef6eca-eb64-43b3-8c7d-b5a230282a8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.400436] env[65680]: DEBUG oslo_concurrency.lockutils [req-ef92e4b6-3119-47de-9a5a-5a1ad40e6654 req-f94dae28-59d3-41db-998c-0f62f439ec1e service nova] Lock "05ef6eca-eb64-43b3-8c7d-b5a230282a8f-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.400608] env[65680]: DEBUG oslo_concurrency.lockutils [req-ef92e4b6-3119-47de-9a5a-5a1ad40e6654 req-f94dae28-59d3-41db-998c-0f62f439ec1e service nova] Lock "05ef6eca-eb64-43b3-8c7d-b5a230282a8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.400608] env[65680]: DEBUG nova.compute.manager [req-ef92e4b6-3119-47de-9a5a-5a1ad40e6654 req-f94dae28-59d3-41db-998c-0f62f439ec1e service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] No waiting events found dispatching network-vif-plugged-c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 945.400758] env[65680]: WARNING nova.compute.manager [req-ef92e4b6-3119-47de-9a5a-5a1ad40e6654 req-f94dae28-59d3-41db-998c-0f62f439ec1e service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Received unexpected event network-vif-plugged-c953efe3-8348-4fd0-a558-0913fd2880d2 for instance with vm_state building and task_state spawning. [ 945.406756] env[65680]: DEBUG oslo_vmware.api [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Task: {'id': task-2847925, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080894} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 945.408264] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 945.408442] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 945.408609] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 945.408767] env[65680]: INFO nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 945.410657] env[65680]: DEBUG nova.compute.claims [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 945.410814] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.411060] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.413369] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b869949e-d3b1-4bef-85a0-d892d7878dce {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.417176] env[65680]: DEBUG nova.policy [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65370eccd0d14c8ca37ee4ab40141956', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb30ad84896d480885a85e4621656c6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 945.435368] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 945.440639] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 945.441350] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.442075] env[65680]: DEBUG nova.compute.utils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance 40a7ee3c-8627-47f3-887e-31112586e799 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 945.443632] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 945.443789] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 945.445379] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 945.445379] env[65680]: DEBUG nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 945.445379] env[65680]: DEBUG nova.network.neutron [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 945.461699] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 945.461938] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 945.462111] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 945.462297] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 945.462442] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 945.462581] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 945.462785] env[65680]: DEBUG nova.virt.hardware [None 
req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 945.462963] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 945.463163] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 945.463327] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 945.463498] env[65680]: DEBUG nova.virt.hardware [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 945.464421] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bc98191-77e1-4e3f-b0c1-797363c0a334 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.471674] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02661c08-a923-4c23-a959-636317e67b88 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.496383] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 945.497142] env[65680]: ERROR nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] result = getattr(controller, method)(*args, **kwargs) [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._get(image_id) [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return RequestIdProxy(wrapped(*args, **kwargs)) [ 945.497142] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] resp, body = self.http_client.get(url, headers=header) [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.request(url, 'GET', **kwargs) [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._handle_response(resp) [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise exc.from_response(resp, resp.content) [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] During handling of the above exception, another exception occurred: [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 945.497423] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] yield resources [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.driver.spawn(context, instance, image_meta, [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._vmops.spawn(context, instance, image_meta, injected_files, [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._fetch_image_if_missing(context, vi) [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] image_fetch(context, vi, tmp_image_ds_loc) [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] images.fetch_image( [ 945.497680] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] metadata = IMAGE_API.get(context, image_ref) [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return session.show(context, image_id, [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] _reraise_translated_image_exception(image_id) [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise new_exc.with_traceback(exc_trace) [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] result = getattr(controller, method)(*args, **kwargs) [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 945.497962] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._get(image_id) [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return RequestIdProxy(wrapped(*args, **kwargs)) [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] resp, body = self.http_client.get(url, headers=header) [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.request(url, 'GET', **kwargs) [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._handle_response(resp) [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise exc.from_response(resp, resp.content) [ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 945.498381] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 945.498650] env[65680]: INFO nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Terminating instance [ 945.499183] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 945.499183] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 945.499756] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 945.499944] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 945.500180] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ab8254bc-3949-4caf-80af-d026542e8c2c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.503126] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1efe4e5-bdd5-4b96-95c0-24e8e490c575 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.510791] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 945.511064] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b1b6b6bd-da37-4780-a337-4f3b2214db71 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.515176] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 945.515176] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] 
Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 945.515176] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-020a4fbb-9035-45f7-a31d-edc68608b810 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.519491] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Waiting for the task: (returnval){ [ 945.519491] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]524b401f-1343-e28d-dae7-64fef31eaca5" [ 945.519491] env[65680]: _type = "Task" [ 945.519491] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 945.531999] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]524b401f-1343-e28d-dae7-64fef31eaca5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 945.545497] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Successfully updated port: c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 945.556128] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquiring lock "refresh_cache-05ef6eca-eb64-43b3-8c7d-b5a230282a8f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 945.556327] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquired lock "refresh_cache-05ef6eca-eb64-43b3-8c7d-b5a230282a8f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 945.556485] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 945.582209] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 945.582405] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Deleting contents of the VM from datastore 
datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 945.582575] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Deleting the datastore file [datastore1] 2f6ce1b8-d869-4219-851a-43ae3ddd3816 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 945.583121] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8a59a5d4-e038-47d6-9713-185bde4aef14 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.589785] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Waiting for the task: (returnval){ [ 945.589785] env[65680]: value = "task-2847927" [ 945.589785] env[65680]: _type = "Task" [ 945.589785] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 945.598680] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Task: {'id': task-2847927, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 945.679727] env[65680]: DEBUG neutronclient.v2_0.client [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 945.681422] env[65680]: ERROR nova.compute.manager [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] result = getattr(controller, method)(*args, **kwargs) [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._get(image_id) [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return RequestIdProxy(wrapped(*args, **kwargs)) [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 945.681422] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] resp, body = self.http_client.get(url, headers=header) [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.request(url, 'GET', **kwargs) [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._handle_response(resp) [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise exc.from_response(resp, resp.content) [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] During handling of the above exception, another exception occurred: [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 945.681943] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.driver.spawn(context, instance, image_meta, [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._vmops.spawn(context, instance, image_meta, injected_files, [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._fetch_image_if_missing(context, vi) [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] image_fetch(context, vi, tmp_image_ds_loc) [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] images.fetch_image( [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] metadata = IMAGE_API.get(context, image_ref) [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 945.683123] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return session.show(context, image_id, [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] _reraise_translated_image_exception(image_id) [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise new_exc.with_traceback(exc_trace) [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File 
"/opt/stack/nova/nova/image/glance.py", line 285, in show [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] result = getattr(controller, method)(*args, **kwargs) [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._get(image_id) [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return RequestIdProxy(wrapped(*args, **kwargs)) [ 945.683494] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] resp, body = self.http_client.get(url, headers=header) [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.request(url, 'GET', **kwargs) [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._handle_response(resp) [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise exc.from_response(resp, resp.content) [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] During handling of the above exception, another exception occurred: [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 945.683771] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._build_and_run_instance(context, instance, image, [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] with excutils.save_and_reraise_exception(): [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.force_reraise() [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise self.value [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] with self.rt.instance_claim(context, instance, node, allocs, [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.abort() [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 945.684074] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return f(*args, **kwargs) [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._unset_instance_host_and_node(instance) [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 
40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] instance.save() [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] updates, result = self.indirection_api.object_action( [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return cctxt.call(context, 'object_action', objinst=objinst, [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 945.684357] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] result = self.transport._send( [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._driver.send(target, ctxt, message, [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise result [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] nova.exception_Remote.InstanceNotFound_Remote: Instance 40a7ee3c-8627-47f3-887e-31112586e799 could not be found. 
[ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return getattr(target, method)(*args, **kwargs) [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684619] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return fn(self, *args, **kwargs) [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] old_ref, inst_ref = db.instance_update_and_get_original( [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return f(*args, **kwargs) [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] with excutils.save_and_reraise_exception() as ectxt: [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.force_reraise() [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.684891] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise self.value [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return f(*args, **kwargs) [ 945.685231] 
env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return f(context, *args, **kwargs) [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise exception.InstanceNotFound(instance_id=uuid) [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685231] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] nova.exception.InstanceNotFound: Instance 40a7ee3c-8627-47f3-887e-31112586e799 could not be found. [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] During handling of the above exception, another exception occurred: [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] ret = obj(*args, **kwargs) [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] exception_handler_v20(status_code, error_body) [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise client_exc(message=error_message, [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 
40a7ee3c-8627-47f3-887e-31112586e799] Neutron server returns request_ids: ['req-4ba31d66-f15d-4a7a-bb08-b6662b927796'] [ 945.685573] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] During handling of the above exception, another exception occurred: [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Traceback (most recent call last): [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._deallocate_network(context, instance, requested_networks) [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self.network_api.deallocate_for_instance( [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] data = neutron.list_ports(**search_opts) [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] ret = obj(*args, **kwargs) [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 945.685877] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.list('ports', self.ports_path, retrieve_all, [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] ret = obj(*args, **kwargs) [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] for r in self._pagination(collection, path, **params): [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] res = self.get(path, params=params) [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 
40a7ee3c-8627-47f3-887e-31112586e799] ret = obj(*args, **kwargs) [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.retry_request("GET", action, body=body, [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] ret = obj(*args, **kwargs) [ 945.687226] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] return self.do_request(method, action, body=body, [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] ret = obj(*args, **kwargs) [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] self._handle_fault_response(status_code, replybody, resp) [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] raise exception.Unauthorized() [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] nova.exception.Unauthorized: Not authorized. [ 945.687709] env[65680]: ERROR nova.compute.manager [instance: 40a7ee3c-8627-47f3-887e-31112586e799] [ 945.697451] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 945.708289] env[65680]: DEBUG oslo_concurrency.lockutils [None req-0345e453-3eb2-4b56-a0ea-f95b99004262 tempest-SecurityGroupsTestJSON-416226944 tempest-SecurityGroupsTestJSON-416226944-project-member] Lock "40a7ee3c-8627-47f3-887e-31112586e799" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 312.887s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.720924] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 945.771122] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.771551] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.773852] env[65680]: INFO nova.compute.claims [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 945.937602] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5610e6a0-9322-4391-ba10-7cb6911d3ab4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.945752] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-758040c2-44e2-4629-85d1-134bc2de6caf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.978824] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1287ebfd-1b66-4cdc-9a8b-479eed08e805 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.987098] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e5f2df4-f875-43d6-880d-12ad3bf5a658 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.004487] env[65680]: DEBUG nova.compute.provider_tree [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 946.016527] env[65680]: DEBUG nova.scheduler.client.report [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 946.031231] env[65680]: DEBUG 
nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 946.031575] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Creating directory with path [datastore1] vmware_temp/c3fa54e0-9bf4-4b35-975b-2a75d7eac6be/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 946.031842] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-909c90f5-abbd-4a26-9e91-2ef91b7482cb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.035577] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.036087] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 946.051287] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Created directory with path [datastore1] vmware_temp/c3fa54e0-9bf4-4b35-975b-2a75d7eac6be/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 946.051513] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Fetch image to [datastore1] vmware_temp/c3fa54e0-9bf4-4b35-975b-2a75d7eac6be/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 946.051765] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/c3fa54e0-9bf4-4b35-975b-2a75d7eac6be/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 946.052449] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebbfa73f-620e-480f-942f-3d404d7b8ce6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.064114] env[65680]: DEBUG nova.network.neutron [None 
req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Successfully created port: 1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 946.064114] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266cb97e-85a6-4f61-a390-dac606e6f413 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.072548] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10e9fa1d-9832-4b32-90bc-9918a6d28851 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.113631] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d54df6b9-f640-4664-bd7b-c527b18bad62 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.118021] env[65680]: DEBUG nova.compute.utils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 946.119369] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 946.119587] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 946.125976] env[65680]: DEBUG oslo_vmware.api [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Task: {'id': task-2847927, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078607} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 946.130966] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 946.130966] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 946.130966] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 946.130966] env[65680]: INFO nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Took 0.63 seconds to destroy the instance on the hypervisor. [ 946.132046] env[65680]: DEBUG nova.compute.claims [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 946.132255] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.132500] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.135900] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Start building block device mappings for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 946.138464] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8a25e5cf-d8fc-4520-b79b-00365e905dfc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.160598] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 946.167386] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.032s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.167386] env[65680]: DEBUG nova.compute.utils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance 2f6ce1b8-d869-4219-851a-43ae3ddd3816 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 946.168796] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 946.169155] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 946.169595] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 946.170228] env[65680]: DEBUG nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 946.170685] env[65680]: DEBUG nova.network.neutron [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 946.184325] env[65680]: DEBUG nova.policy [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0edb97aa13614bcd9ec4f858f9df3307', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1624dcd5d40b4484bc8a806cdcb8c090', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 946.215093] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 946.242283] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 946.242283] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 946.242283] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 946.243093] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 946.243093] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 946.243093] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 946.243093] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 946.243093] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] 
Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 946.243660] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 946.243660] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 946.243660] env[65680]: DEBUG nova.virt.hardware [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 946.244115] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fd4a98e-0049-4f38-bd3f-03c03a4f3d21 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.247296] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 946.247927] env[65680]: ERROR nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
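The ImageNotAuthorized error reported above is Nova's image layer translating a Glance 401: the frames at nova/image/glance.py:287 and :1031 in the traceback that follows show the original glanceclient HTTPUnauthorized being re-raised under a Nova exception type while the client's traceback is preserved. A minimal sketch of that translation pattern, using stand-in exception classes rather than Nova's actual code:

```python
# Illustrative sketch only: the class names and show_image() helper are
# hypothetical stand-ins, not Nova's implementation. The point is the visible
# pattern "raise new_exc.with_traceback(exc_trace)": swap the exception type
# at the service boundary but keep the original client frames for debugging.
import sys


class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized (a 401 response)."""


class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""


def show_image(client, image_id):
    try:
        # assumed: client.get() raises HTTPUnauthorized when the token is rejected
        return client.get(image_id)
    except HTTPUnauthorized:
        exc_trace = sys.exc_info()[2]
        # re-raise as the service-level exception, preserving the traceback
        raise ImageNotAuthorized(image_id).with_traceback(exc_trace)
```

Callers such as the VMware image-fetch path then only ever handle the Nova-level exception; the glanceclient frames remain in the logged traceback below.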
[ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] result = getattr(controller, method)(*args, **kwargs) [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._get(image_id) [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return RequestIdProxy(wrapped(*args, **kwargs)) [ 946.247927] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] resp, body = self.http_client.get(url, headers=header) [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.request(url, 'GET', **kwargs) [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._handle_response(resp) [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise exc.from_response(resp, resp.content) [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] During handling of the above exception, another exception occurred: [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 946.248378] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] yield resources [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.driver.spawn(context, instance, image_meta, [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._vmops.spawn(context, instance, image_meta, injected_files, [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._fetch_image_if_missing(context, vi) [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] image_fetch(context, vi, tmp_image_ds_loc) [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] images.fetch_image( [ 946.248792] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] metadata = IMAGE_API.get(context, image_ref) [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return session.show(context, image_id, [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] _reraise_translated_image_exception(image_id) [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise new_exc.with_traceback(exc_trace) [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] result = getattr(controller, method)(*args, **kwargs) [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 946.249253] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._get(image_id) [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return RequestIdProxy(wrapped(*args, **kwargs)) [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] resp, body = self.http_client.get(url, headers=header) [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.request(url, 'GET', **kwargs) [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._handle_response(resp) [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise exc.from_response(resp, resp.content) [ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 946.249716] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 946.250158] env[65680]: INFO nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Terminating instance [ 946.250237] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 946.250424] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 946.250702] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 946.250920] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 946.252235] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb590e87-812d-4f25-84ef-f14ead8f7c8f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.255189] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f65e3d51-cc3b-4af0-a92f-7106a6fcab5e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.262046] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c9275f-ba57-468d-adc6-522b52362254 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.266286] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 946.267316] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d7f389d0-0c85-4b71-b70d-a8222890e26c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.268719] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Created directory with path [datastore1] 
devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 946.268891] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 946.270234] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Updating instance_info_cache with network_info: [{"id": "c953efe3-8348-4fd0-a558-0913fd2880d2", "address": "fa:16:3e:3b:e2:ed", "network": {"id": "45f2c4d5-d2f7-443f-952f-d082cbb816c7", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1906146182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f89b77f14b3b46e58e32fbb9c68c9ca5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0b29c52-62b0-4a9e-8e1c-41cf6ac8b916", "external-id": "nsx-vlan-transportzone-143", "segmentation_id": 143, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc953efe3-83", "ovs_interfaceid": "c953efe3-8348-4fd0-a558-0913fd2880d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 946.272318] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-52cc48ec-1998-4e64-ad05-33ea03802768 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.288356] env[65680]: DEBUG oslo_vmware.api [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Waiting for the task: (returnval){ [ 946.288356] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52c352f4-f44e-c3b7-a593-9de66235a81d" [ 946.288356] env[65680]: _type = "Task" [ 946.288356] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.289452] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Releasing lock "refresh_cache-05ef6eca-eb64-43b3-8c7d-b5a230282a8f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 946.289724] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance network_info: |[{"id": "c953efe3-8348-4fd0-a558-0913fd2880d2", "address": "fa:16:3e:3b:e2:ed", "network": {"id": "45f2c4d5-d2f7-443f-952f-d082cbb816c7", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1906146182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f89b77f14b3b46e58e32fbb9c68c9ca5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0b29c52-62b0-4a9e-8e1c-41cf6ac8b916", "external-id": "nsx-vlan-transportzone-143", "segmentation_id": 143, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc953efe3-83", "ovs_interfaceid": "c953efe3-8348-4fd0-a558-0913fd2880d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 946.292786] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3b:e2:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a0b29c52-62b0-4a9e-8e1c-41cf6ac8b916', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c953efe3-8348-4fd0-a558-0913fd2880d2', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 946.300400] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Creating folder: Project (f89b77f14b3b46e58e32fbb9c68c9ca5). Parent ref: group-v572532. 
{{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 946.300686] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c20f4da0-5ea4-4ebd-a8a8-e9350d68c96c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.308608] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 946.308845] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Creating directory with path [datastore1] vmware_temp/5765cf5e-1304-4270-88dc-758bcc56a364/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 946.309385] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-97e84308-fdcf-4846-b2a4-6435ed40c0b3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.314034] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Created folder: Project (f89b77f14b3b46e58e32fbb9c68c9ca5) in parent group-v572532. [ 946.314224] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Creating folder: Instances. Parent ref: group-v572592. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 946.314423] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8533af64-7fa4-4051-94dc-88a3d632f731 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.324437] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Created folder: Instances in parent group-v572592. [ 946.324672] env[65680]: DEBUG oslo.service.loopingcall [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 946.324886] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 946.325154] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-90c3de6b-60d4-461b-83dc-529c64228242 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.341781] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Created directory with path [datastore1] vmware_temp/5765cf5e-1304-4270-88dc-758bcc56a364/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 946.341781] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Fetch image to [datastore1] vmware_temp/5765cf5e-1304-4270-88dc-758bcc56a364/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 946.341935] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/5765cf5e-1304-4270-88dc-758bcc56a364/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 946.343298] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa24bac-c62f-4554-8d83-7011b9dbd012 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.346983] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 946.346983] env[65680]: value = "task-2847931" [ 946.346983] env[65680]: _type = "Task" [ 946.346983] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.352685] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ad5d5b-5c38-41b9-b3aa-0a0d2fa260cf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.357814] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847931, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 946.358519] env[65680]: DEBUG neutronclient.v2_0.client [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 946.360037] env[65680]: ERROR nova.compute.manager [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] result = getattr(controller, method)(*args, **kwargs) [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._get(image_id) [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return RequestIdProxy(wrapped(*args, **kwargs)) [ 946.360037] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] resp, body = self.http_client.get(url, headers=header) [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.request(url, 'GET', **kwargs) [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._handle_response(resp) [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise exc.from_response(resp, resp.content) [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] During handling of the above exception, another exception occurred: [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.360374] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.driver.spawn(context, instance, image_meta, [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._vmops.spawn(context, instance, image_meta, injected_files, [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._fetch_image_if_missing(context, vi) [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] image_fetch(context, vi, tmp_image_ds_loc) [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] images.fetch_image( [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] metadata = IMAGE_API.get(context, image_ref) [ 946.360693] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return session.show(context, image_id, [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 946.361038] env[65680]: ERROR 
nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] _reraise_translated_image_exception(image_id) [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise new_exc.with_traceback(exc_trace) [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] result = getattr(controller, method)(*args, **kwargs) [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._get(image_id) [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 946.361038] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return RequestIdProxy(wrapped(*args, **kwargs)) [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] resp, body = self.http_client.get(url, headers=header) [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.request(url, 'GET', **kwargs) [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._handle_response(resp) [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise exc.from_response(resp, resp.content) [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] During handling of the above exception, another exception occurred: [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.361454] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._build_and_run_instance(context, instance, image, [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] with excutils.save_and_reraise_exception(): [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.force_reraise() [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise self.value [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] with self.rt.instance_claim(context, instance, node, allocs, [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.abort() [ 946.361825] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return f(*args, **kwargs) [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._unset_instance_host_and_node(instance) [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 
2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] instance.save() [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] updates, result = self.indirection_api.object_action( [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return cctxt.call(context, 'object_action', objinst=objinst, [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 946.362229] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] result = self.transport._send( [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._driver.send(target, ctxt, message, [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise result [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] nova.exception_Remote.InstanceNotFound_Remote: Instance 2f6ce1b8-d869-4219-851a-43ae3ddd3816 could not be found. 
[ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return getattr(target, method)(*args, **kwargs) [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362594] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return fn(self, *args, **kwargs) [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] old_ref, inst_ref = db.instance_update_and_get_original( [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return f(*args, **kwargs) [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] with excutils.save_and_reraise_exception() as ectxt: [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.force_reraise() [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.362936] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise self.value [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return f(*args, **kwargs) [ 946.363393] 
env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return f(context, *args, **kwargs) [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise exception.InstanceNotFound(instance_id=uuid) [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363393] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] nova.exception.InstanceNotFound: Instance 2f6ce1b8-d869-4219-851a-43ae3ddd3816 could not be found. [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] During handling of the above exception, another exception occurred: [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] ret = obj(*args, **kwargs) [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] exception_handler_v20(status_code, error_body) [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise client_exc(message=error_message, [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 
2f6ce1b8-d869-4219-851a-43ae3ddd3816] Neutron server returns request_ids: ['req-92889819-8c12-4156-9750-13ac28ad94d5'] [ 946.363851] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] During handling of the above exception, another exception occurred: [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Traceback (most recent call last): [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._deallocate_network(context, instance, requested_networks) [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self.network_api.deallocate_for_instance( [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] data = neutron.list_ports(**search_opts) [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] ret = obj(*args, **kwargs) [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 946.364271] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.list('ports', self.ports_path, retrieve_all, [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] ret = obj(*args, **kwargs) [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] for r in self._pagination(collection, path, **params): [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] res = self.get(path, params=params) [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 
2f6ce1b8-d869-4219-851a-43ae3ddd3816] ret = obj(*args, **kwargs) [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.retry_request("GET", action, body=body, [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] ret = obj(*args, **kwargs) [ 946.364617] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] return self.do_request(method, action, body=body, [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] ret = obj(*args, **kwargs) [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] self._handle_fault_response(status_code, replybody, resp) [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] raise exception.Unauthorized() [ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] nova.exception.Unauthorized: Not authorized. 
[ 946.364964] env[65680]: ERROR nova.compute.manager [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] [ 946.364964] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 946.365276] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 946.365276] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Deleting the datastore file [datastore1] b163d5b8-b01c-4ace-96e7-56276ab4ba82 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 946.365276] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2022373c-6300-44d5-bb72-1b2ebe6c2a5a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.372264] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e66e50a6-8133-4efa-bd6b-6689cbb1e97c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.379839] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Waiting for the task: (returnval){ [ 946.379839] env[65680]: value = "task-2847932" [ 946.379839] env[65680]: _type = "Task" [ 946.379839] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.412434] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e1a9e417-ba68-48fc-b7d9-86f2090b0f69 tempest-ServerRescueNegativeTestJSON-17755879 tempest-ServerRescueNegativeTestJSON-17755879-project-member] Lock "2f6ce1b8-d869-4219-851a-43ae3ddd3816" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 312.854s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.413597] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9297c67-d29a-4742-8b20-188889beb025 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.420437] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Task: {'id': task-2847932, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 946.424135] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0bb9be6c-a112-4831-a020-c705448b50f5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.430508] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 946.449478] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 946.485095] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.485531] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.488159] env[65680]: INFO nova.compute.claims [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 946.526022] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Successfully created port: d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 946.602449] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 946.603295] env[65680]: ERROR nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] result = getattr(controller, method)(*args, **kwargs) [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._get(image_id) [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 946.603295] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] resp, body = self.http_client.get(url, headers=header) [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.request(url, 'GET', **kwargs) [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._handle_response(resp) [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise exc.from_response(resp, resp.content) [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] During handling of the above exception, another exception occurred: [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 946.603691] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] yield resources [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.driver.spawn(context, instance, image_meta, [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._fetch_image_if_missing(context, vi) [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] image_fetch(context, vi, tmp_image_ds_loc) [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] images.fetch_image( [ 946.604304] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] metadata = IMAGE_API.get(context, image_ref) [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return session.show(context, image_id, [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] _reraise_translated_image_exception(image_id) [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise new_exc.with_traceback(exc_trace) [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] result = getattr(controller, method)(*args, **kwargs) [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 946.604718] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._get(image_id) [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] resp, body = self.http_client.get(url, headers=header) [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.request(url, 'GET', **kwargs) [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._handle_response(resp) [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise exc.from_response(resp, resp.content) [ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 946.605215] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 946.605526] env[65680]: INFO nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Terminating instance [ 946.608806] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 946.609058] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 946.609706] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 946.609902] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 946.610148] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db6f0ffa-1f74-48ef-a176-a33b18172115 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.612958] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0330fec7-86f5-409d-b307-29c4a8ecfcc3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.622625] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 946.622893] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f2496a74-66bc-4123-8009-ce2b1d400c59 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.625495] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 946.625738] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d 
tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 946.626784] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-50c853de-7b3d-4a56-8e8d-078295f075f4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.634604] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Waiting for the task: (returnval){ [ 946.634604] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5206048e-6b2a-446b-ccc6-23d2fe1a355d" [ 946.634604] env[65680]: _type = "Task" [ 946.634604] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.645196] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5206048e-6b2a-446b-ccc6-23d2fe1a355d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 946.672873] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb7d28ec-e5b7-4e2b-8e67-a032a0a128ab {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.681075] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f798519f-eba5-44bc-bb94-455c686e63ae {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.685343] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 946.685559] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 946.685736] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Deleting the datastore file [datastore1] b935e1a7-1c77-4398-a964-cd7da312fc1b {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 946.686325] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b7462dbc-f0ce-4780-be9a-72382bd6e04a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.714137] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b948c69e-4999-4f19-90ce-b6f78d454024 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.716597] env[65680]: DEBUG oslo_vmware.api [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Waiting for the task: (returnval){ [ 946.716597] env[65680]: value = "task-2847934" [ 946.716597] env[65680]: _type = "Task" [ 946.716597] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.722705] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b81c98a0-1d44-4241-99c0-72d4fe337117 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.730014] env[65680]: DEBUG oslo_vmware.api [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Task: {'id': task-2847934, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 946.739353] env[65680]: DEBUG nova.compute.provider_tree [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 946.747930] env[65680]: DEBUG nova.scheduler.client.report [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 946.761224] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.761725] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Start building networks asynchronously for instance. 
{{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 946.804870] env[65680]: DEBUG nova.compute.utils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 946.806392] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 946.806574] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 946.814809] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 946.857080] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847931, 'name': CreateVM_Task, 'duration_secs': 0.317813} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 946.857306] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 946.857984] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 946.858165] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 946.858465] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 946.858694] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d48bcc97-18eb-46e9-854c-fe4730bcf65d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.862870] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe 
tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Waiting for the task: (returnval){ [ 946.862870] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52b1df1c-42bf-4e26-40e8-87abfab1d72b" [ 946.862870] env[65680]: _type = "Task" [ 946.862870] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.867012] env[65680]: DEBUG nova.policy [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '114a607aaccf47adae68d99208bc3612', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a407e2bf97c14a7f88a7e8229e894d7a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 946.873198] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52b1df1c-42bf-4e26-40e8-87abfab1d72b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 946.876616] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Start spawning the instance on the hypervisor. {{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 946.888762] env[65680]: DEBUG oslo_vmware.api [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Task: {'id': task-2847932, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074795} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 946.889073] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 946.889480] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 946.889565] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 946.889769] env[65680]: INFO nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Took 0.64 seconds to destroy the instance on the hypervisor. [ 946.893996] env[65680]: DEBUG nova.compute.claims [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 946.893996] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.893996] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.898644] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False 
{{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 946.898854] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 946.899020] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 946.899206] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 946.899397] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 946.899584] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 946.899798] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 946.899953] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 946.900132] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 946.900444] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 946.900444] env[65680]: DEBUG nova.virt.hardware [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 946.901276] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-304ec7a2-0762-440b-9269-f8b220143fea {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.908206] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7e99be4-e47f-4a28-ba25-733037e98ecf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.921948] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.922619] env[65680]: DEBUG nova.compute.utils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance b163d5b8-b01c-4ace-96e7-56276ab4ba82 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 946.924026] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 946.924159] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 946.924332] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 946.924500] env[65680]: DEBUG nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 946.924659] env[65680]: DEBUG nova.network.neutron [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 947.038295] env[65680]: DEBUG neutronclient.v2_0.client [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 947.040475] env[65680]: ERROR nova.compute.manager [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] result = getattr(controller, method)(*args, **kwargs) [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._get(image_id) [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return RequestIdProxy(wrapped(*args, **kwargs)) [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 947.040475] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] resp, body = self.http_client.get(url, headers=header) [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get 
[ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.request(url, 'GET', **kwargs) [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._handle_response(resp) [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise exc.from_response(resp, resp.content) [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] During handling of the above exception, another exception occurred: [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 947.040778] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.driver.spawn(context, instance, image_meta, [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._fetch_image_if_missing(context, vi) [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] image_fetch(context, vi, tmp_image_ds_loc) [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] images.fetch_image( [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in 
fetch_image [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] metadata = IMAGE_API.get(context, image_ref) [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 947.041057] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return session.show(context, image_id, [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] _reraise_translated_image_exception(image_id) [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise new_exc.with_traceback(exc_trace) [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] result = getattr(controller, method)(*args, **kwargs) [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._get(image_id) [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return RequestIdProxy(wrapped(*args, **kwargs)) [ 947.041342] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] resp, body = self.http_client.get(url, headers=header) [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.request(url, 'GET', **kwargs) [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._handle_response(resp) [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: 
b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise exc.from_response(resp, resp.content) [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] During handling of the above exception, another exception occurred: [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 947.041612] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._build_and_run_instance(context, instance, image, [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] with excutils.save_and_reraise_exception(): [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.force_reraise() [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise self.value [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] with self.rt.instance_claim(context, instance, node, allocs, [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.abort() [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 947.041875] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: 
b163d5b8-b01c-4ace-96e7-56276ab4ba82] return f(*args, **kwargs) [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._unset_instance_host_and_node(instance) [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] instance.save() [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] updates, result = self.indirection_api.object_action( [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return cctxt.call(context, 'object_action', objinst=objinst, [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 947.042179] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] result = self.transport._send( [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._driver.send(target, ctxt, message, [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise result [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] nova.exception_Remote.InstanceNotFound_Remote: Instance b163d5b8-b01c-4ace-96e7-56276ab4ba82 could not be found. 
[ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return getattr(target, method)(*args, **kwargs) [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042447] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return fn(self, *args, **kwargs) [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] old_ref, inst_ref = db.instance_update_and_get_original( [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return f(*args, **kwargs) [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] with excutils.save_and_reraise_exception() as ectxt: [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.force_reraise() [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.042724] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise self.value [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return f(*args, **kwargs) [ 947.043098] 
env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return f(context, *args, **kwargs) [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise exception.InstanceNotFound(instance_id=uuid) [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043098] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] nova.exception.InstanceNotFound: Instance b163d5b8-b01c-4ace-96e7-56276ab4ba82 could not be found. [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] During handling of the above exception, another exception occurred: [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] ret = obj(*args, **kwargs) [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] exception_handler_v20(status_code, error_body) [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise client_exc(message=error_message, [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: 
b163d5b8-b01c-4ace-96e7-56276ab4ba82] Neutron server returns request_ids: ['req-2b408f47-e06e-4687-9688-a4ac49dfaf7a'] [ 947.043417] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] During handling of the above exception, another exception occurred: [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Traceback (most recent call last): [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._deallocate_network(context, instance, requested_networks) [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self.network_api.deallocate_for_instance( [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] data = neutron.list_ports(**search_opts) [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] ret = obj(*args, **kwargs) [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 947.043763] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.list('ports', self.ports_path, retrieve_all, [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] ret = obj(*args, **kwargs) [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] for r in self._pagination(collection, path, **params): [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] res = self.get(path, params=params) [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: 
b163d5b8-b01c-4ace-96e7-56276ab4ba82] ret = obj(*args, **kwargs) [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.retry_request("GET", action, body=body, [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] ret = obj(*args, **kwargs) [ 947.045060] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] return self.do_request(method, action, body=body, [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] ret = obj(*args, **kwargs) [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] self._handle_fault_response(status_code, replybody, resp) [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] raise exception.Unauthorized() [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] nova.exception.Unauthorized: Not authorized. [ 947.045588] env[65680]: ERROR nova.compute.manager [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] [ 947.065254] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7878fe51-0680-4ae0-833b-2f3f1162a198 tempest-ServerTagsTestJSON-2119772601 tempest-ServerTagsTestJSON-2119772601-project-member] Lock "b163d5b8-b01c-4ace-96e7-56276ab4ba82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 312.309s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.074722] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 947.129951] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Successfully updated port: d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 947.142495] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.143851] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.144480] env[65680]: INFO nova.compute.claims [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 947.151519] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquiring lock "refresh_cache-dd382edd-abe8-4764-a9d5-4144ef7d50b0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 947.151981] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquired lock "refresh_cache-dd382edd-abe8-4764-a9d5-4144ef7d50b0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 947.151981] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 947.153909] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 947.153909] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Creating directory with path [datastore1] vmware_temp/55bf25b8-2d0e-4924-9c2f-de2ed221e101/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.154284] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4d30e50c-7070-41dc-bdcd-67cf392ff3a6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.180488] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Created directory with path [datastore1] vmware_temp/55bf25b8-2d0e-4924-9c2f-de2ed221e101/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.180488] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Fetch image to [datastore1] vmware_temp/55bf25b8-2d0e-4924-9c2f-de2ed221e101/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 947.180488] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/55bf25b8-2d0e-4924-9c2f-de2ed221e101/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 947.180488] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be105004-92df-4e5a-ad87-884bc9ce9441 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.190643] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b14a11ce-e129-4119-b3e6-4244facf81f8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.205620] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b9aa66-1bce-48c8-8050-bdf668798b16 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.210771] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Successfully created port: 01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 947.213141] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 947.254251] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af23246c-8243-4f0c-af9e-bef3f6b50945 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.261076] env[65680]: DEBUG oslo_vmware.api [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Task: {'id': task-2847934, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074485} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 947.261076] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 947.261076] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 947.261076] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 947.261255] env[65680]: INFO nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Took 0.65 seconds to destroy the instance on the hypervisor. 
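The entries above keep exercising the same two mechanisms: an oslo_concurrency lock taken and released around the shared image-cache path, and an oslo.vmware task (CreateVM_Task, SearchDatastore_Task, DeleteDatastoreFile_Task) being invoked and then polled by _poll_task until it completes. The sketch below is a minimal, illustrative reconstruction of that call pattern only, not Nova's actual code; the session object is assumed to be an already-created oslo.vmware API session, and datastore_browser, cache_path and search_spec are placeholder names introduced here for the example.

```python
# Hypothetical sketch (not Nova source): the call pattern behind the log lines above.
# Assumes `session` is an existing oslo.vmware API session; `datastore_browser`,
# `cache_path` and `search_spec` are placeholders used only for illustration.
from oslo_concurrency import lockutils


def search_image_cache(session, datastore_browser, cache_path, search_spec):
    """Serialize access to the image-cache folder, then run and poll a vCenter task.

    Mirrors the 'Acquiring lock ... devstack-image-cache_base' /
    'Invoking HostDatastoreBrowser.SearchDatastore_Task' /
    'Task ... progress is 0%' sequence seen in the log.
    """
    # lockutils.lock() returns a context manager; the "Acquiring lock" /
    # "Lock ... acquired" / "Releasing lock" lines in the log come from this machinery.
    with lockutils.lock(cache_path):
        # invoke_api(module, method, managed_object, **kwargs) issues the SOAP call,
        # which the log records as "Invoking HostDatastoreBrowser.SearchDatastore_Task
        # with opID=oslo.vmware-...".
        task = session.invoke_api(session.vim, 'SearchDatastore_Task',
                                  datastore_browser,
                                  datastorePath=cache_path,
                                  searchSpec=search_spec)
        # wait_for_task() polls the task (the "_poll_task ... progress is 0%" and
        # "completed successfully" lines) and raises if the task ends in error.
        task_info = session.wait_for_task(task)
    return task_info.result
```

The same invoke-then-wait shape applies to the CreateVM_Task and DeleteDatastoreFile_Task entries in this section; only the vSphere method name and its arguments differ.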
[ 947.264313] env[65680]: DEBUG nova.compute.claims [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 947.264524] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.264759] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4b3934e5-1e5b-48e2-8594-f380ef8374e8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.286277] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 947.365930] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eb5cb9c-ac8c-48a2-9ea3-b64a8eaabc0f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.379486] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 947.379731] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 947.379932] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 947.381253] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61633cd3-4505-4845-90a3-0c6e247e2f38 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.413303] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 947.414145] env[65680]: ERROR nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] result = getattr(controller, method)(*args, **kwargs) [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._get(image_id) [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return RequestIdProxy(wrapped(*args, **kwargs)) [ 947.414145] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] resp, body = self.http_client.get(url, headers=header) [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.request(url, 'GET', **kwargs) [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._handle_response(resp) [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise exc.from_response(resp, resp.content) [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] glanceclient.exc.HTTPUnauthorized: HTTP 
401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] During handling of the above exception, another exception occurred: [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 947.414453] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] yield resources [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.driver.spawn(context, instance, image_meta, [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._fetch_image_if_missing(context, vi) [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] image_fetch(context, vi, tmp_image_ds_loc) [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] images.fetch_image( [ 947.414719] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] metadata = IMAGE_API.get(context, image_ref) [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return session.show(context, image_id, [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 947.415018] env[65680]: ERROR 
nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] _reraise_translated_image_exception(image_id) [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise new_exc.with_traceback(exc_trace) [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] result = getattr(controller, method)(*args, **kwargs) [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 947.415018] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._get(image_id) [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return RequestIdProxy(wrapped(*args, **kwargs)) [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] resp, body = self.http_client.get(url, headers=header) [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.request(url, 'GET', **kwargs) [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._handle_response(resp) [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise exc.from_response(resp, resp.content) [ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 947.415306] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 947.415579] env[65680]: INFO nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Terminating instance [ 947.416441] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-627655e5-dd6b-469c-b478-1f02b777ee2e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.418980] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 947.419198] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.419811] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 947.419997] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 947.420225] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4762f7ab-e0f6-4187-92b1-06071f4677d3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.422532] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6739d4a2-75c4-4f19-9ca8-525167879c04 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.431950] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6554c88-13a8-4960-bf40-eab3f67b138b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.435969] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 947.437005] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ba7ce1ba-a641-4e2e-9bc3-1802b0ed2c22 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.438412] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.438585] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 947.439552] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7e955a1c-7496-4795-863c-202c4196c391 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.449716] env[65680]: DEBUG nova.compute.provider_tree [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 947.453513] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Waiting for the task: (returnval){ [ 947.453513] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52479335-1415-6e9a-cede-137f53c30df4" [ 947.453513] env[65680]: _type = "Task" [ 947.453513] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.460650] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52479335-1415-6e9a-cede-137f53c30df4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.461988] env[65680]: DEBUG nova.scheduler.client.report [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 947.481551] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.482090] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 947.484670] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.220s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.511712] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.512554] env[65680]: DEBUG nova.compute.utils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance b935e1a7-1c77-4398-a964-cd7da312fc1b could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 947.514226] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance disappeared during build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 947.514396] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 947.514556] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 947.514725] env[65680]: DEBUG nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 947.514883] env[65680]: DEBUG nova.network.neutron [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 947.518477] env[65680]: DEBUG nova.compute.utils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 947.520075] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 947.520247] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 947.529901] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 947.607600] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 947.655202] env[65680]: DEBUG nova.policy [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d01d59f2f4f40d0b577636ba61abb2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8004b2d89fa641f8881699a2300ecf4f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 947.673257] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 947.673520] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 947.673671] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 947.673845] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 947.673987] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 947.674194] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 947.674415] env[65680]: DEBUG 
nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 947.674567] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 947.674728] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 947.674883] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 947.675061] env[65680]: DEBUG nova.virt.hardware [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 947.678411] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1f5a4e5-549f-4a05-b1d3-b4cc30429230 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.684059] env[65680]: DEBUG neutronclient.v2_0.client [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 947.685799] env[65680]: ERROR nova.compute.manager [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] result = getattr(controller, method)(*args, **kwargs) [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._get(image_id) [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 947.685799] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] resp, body = self.http_client.get(url, headers=header) [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.request(url, 'GET', **kwargs) [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._handle_response(resp) [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise exc.from_response(resp, resp.content) [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] During handling of the above exception, another exception occurred: [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.686074] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.driver.spawn(context, instance, image_meta, [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._fetch_image_if_missing(context, vi) [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] image_fetch(context, vi, tmp_image_ds_loc) [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] images.fetch_image( [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] metadata = IMAGE_API.get(context, image_ref) [ 947.686478] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return session.show(context, image_id, [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] _reraise_translated_image_exception(image_id) [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise new_exc.with_traceback(exc_trace) [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File 
"/opt/stack/nova/nova/image/glance.py", line 285, in show [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] result = getattr(controller, method)(*args, **kwargs) [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._get(image_id) [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 947.688171] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] resp, body = self.http_client.get(url, headers=header) [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.request(url, 'GET', **kwargs) [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._handle_response(resp) [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise exc.from_response(resp, resp.content) [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] During handling of the above exception, another exception occurred: [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.688744] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._build_and_run_instance(context, instance, image, [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] with excutils.save_and_reraise_exception(): [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.force_reraise() [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise self.value [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] with self.rt.instance_claim(context, instance, node, allocs, [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.abort() [ 947.689061] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return f(*args, **kwargs) [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._unset_instance_host_and_node(instance) [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: 
b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] instance.save() [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] updates, result = self.indirection_api.object_action( [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return cctxt.call(context, 'object_action', objinst=objinst, [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 947.689368] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] result = self.transport._send( [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._driver.send(target, ctxt, message, [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise result [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] nova.exception_Remote.InstanceNotFound_Remote: Instance b935e1a7-1c77-4398-a964-cd7da312fc1b could not be found. 
[ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return getattr(target, method)(*args, **kwargs) [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.689682] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return fn(self, *args, **kwargs) [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] old_ref, inst_ref = db.instance_update_and_get_original( [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return f(*args, **kwargs) [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] with excutils.save_and_reraise_exception() as ectxt: [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.force_reraise() [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690017] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise self.value [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return f(*args, **kwargs) [ 947.690404] 
env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return f(context, *args, **kwargs) [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise exception.InstanceNotFound(instance_id=uuid) [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690404] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] nova.exception.InstanceNotFound: Instance b935e1a7-1c77-4398-a964-cd7da312fc1b could not be found. [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] During handling of the above exception, another exception occurred: [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] ret = obj(*args, **kwargs) [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] exception_handler_v20(status_code, error_body) [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise client_exc(message=error_message, [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: 
b935e1a7-1c77-4398-a964-cd7da312fc1b] Neutron server returns request_ids: ['req-04d37897-f1f4-4307-8e49-5ff9ebb8448b'] [ 947.690945] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] During handling of the above exception, another exception occurred: [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Traceback (most recent call last): [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._deallocate_network(context, instance, requested_networks) [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self.network_api.deallocate_for_instance( [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] data = neutron.list_ports(**search_opts) [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] ret = obj(*args, **kwargs) [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 947.691602] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.list('ports', self.ports_path, retrieve_all, [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] ret = obj(*args, **kwargs) [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] for r in self._pagination(collection, path, **params): [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] res = self.get(path, params=params) [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: 
b935e1a7-1c77-4398-a964-cd7da312fc1b] ret = obj(*args, **kwargs) [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.retry_request("GET", action, body=body, [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] ret = obj(*args, **kwargs) [ 947.691921] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] return self.do_request(method, action, body=body, [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] ret = obj(*args, **kwargs) [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] self._handle_fault_response(status_code, replybody, resp) [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] raise exception.Unauthorized() [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] nova.exception.Unauthorized: Not authorized. [ 947.692443] env[65680]: ERROR nova.compute.manager [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] [ 947.692443] env[65680]: DEBUG nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Received event network-changed-c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 947.692741] env[65680]: DEBUG nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Refreshing instance network info cache due to event network-changed-c953efe3-8348-4fd0-a558-0913fd2880d2. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 947.692741] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Acquiring lock "refresh_cache-05ef6eca-eb64-43b3-8c7d-b5a230282a8f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 947.692741] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Acquired lock "refresh_cache-05ef6eca-eb64-43b3-8c7d-b5a230282a8f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 947.692741] env[65680]: DEBUG nova.network.neutron [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Refreshing network info cache for port c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 947.695911] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7cc4c6a-7fcd-4b78-95e4-12851f25347a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.719089] env[65680]: DEBUG oslo_concurrency.lockutils [None req-adacf9c8-1648-4db1-8510-96139e41c015 tempest-ServerAddressesNegativeTestJSON-1981819581 tempest-ServerAddressesNegativeTestJSON-1981819581-project-member] Lock "b935e1a7-1c77-4398-a964-cd7da312fc1b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 312.085s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.746991] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Updating instance_info_cache with network_info: [{"id": "d6d14cac-0618-4f2a-b8a3-caa176d3931c", "address": "fa:16:3e:9b:f1:56", "network": {"id": "720dcf80-b0c2-4e54-948f-b3cfe1e047c4", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-8048602-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1624dcd5d40b4484bc8a806cdcb8c090", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7d09e9-a3dd-4d89-b9dd-2814f5f6dd5d", "external-id": "nsx-vlan-transportzone-591", "segmentation_id": 591, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd6d14cac-06", "ovs_interfaceid": "d6d14cac-0618-4f2a-b8a3-caa176d3931c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 947.757443] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c 
tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Releasing lock "refresh_cache-dd382edd-abe8-4764-a9d5-4144ef7d50b0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 947.757729] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance network_info: |[{"id": "d6d14cac-0618-4f2a-b8a3-caa176d3931c", "address": "fa:16:3e:9b:f1:56", "network": {"id": "720dcf80-b0c2-4e54-948f-b3cfe1e047c4", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-8048602-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1624dcd5d40b4484bc8a806cdcb8c090", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7d09e9-a3dd-4d89-b9dd-2814f5f6dd5d", "external-id": "nsx-vlan-transportzone-591", "segmentation_id": 591, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd6d14cac-06", "ovs_interfaceid": "d6d14cac-0618-4f2a-b8a3-caa176d3931c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 947.758094] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:f1:56', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b7d09e9-a3dd-4d89-b9dd-2814f5f6dd5d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd6d14cac-0618-4f2a-b8a3-caa176d3931c', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 947.765522] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Creating folder: Project (1624dcd5d40b4484bc8a806cdcb8c090). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 947.766167] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ed6a7e2b-2efb-4fa2-9cf1-b62423045d10 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.777184] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Created folder: Project (1624dcd5d40b4484bc8a806cdcb8c090) in parent group-v572532. 
[ 947.777418] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Creating folder: Instances. Parent ref: group-v572595. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 947.777588] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6bc35b3-26ab-465e-8453-bc7f06be6dcd {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.787957] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Created folder: Instances in parent group-v572595. [ 947.788196] env[65680]: DEBUG oslo.service.loopingcall [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 947.788374] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 947.788562] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3135b77f-c1eb-4cd2-a834-e8d941cd954f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.808322] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 947.808322] env[65680]: value = "task-2847938" [ 947.808322] env[65680]: _type = "Task" [ 947.808322] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.815883] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847938, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.964595] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 947.964905] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Creating directory with path [datastore1] vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.965415] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c788161e-9a5a-4158-ba4d-5c9f40050bec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.986681] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Created directory with path [datastore1] vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.986915] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Fetch image to [datastore1] vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 947.987062] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 947.987878] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f2c2c59-3c23-42a0-9019-fa56a0bdc0fc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.001498] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a45782-efb6-4a57-b6e6-6ff380f19b0f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.016720] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e367c83c-80d7-4b7b-8bdc-1333b139baa3 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.055422] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-78eeef4c-3c7d-46d2-b4f4-51b76cc61c36 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.061560] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b1d9ef19-fc2c-4408-985a-b872781125fb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.081177] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 948.127900] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Successfully updated port: 908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 948.133457] env[65680]: DEBUG oslo_vmware.rw_handles [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 948.191861] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "refresh_cache-8b747838-fcd0-494c-bd5a-0e5b1950a44e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.192098] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired lock "refresh_cache-8b747838-fcd0-494c-bd5a-0e5b1950a44e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.192286] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 948.195785] env[65680]: DEBUG oslo_vmware.rw_handles [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Completed reading data from the image iterator. 
{{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 948.196034] env[65680]: DEBUG oslo_vmware.rw_handles [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 948.227339] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 948.245802] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 948.246317] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 948.246568] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Deleting the datastore file [datastore1] cb739449-a329-41b8-964c-8c9db383e846 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 948.246860] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1e959b3e-c186-40b1-ae2f-088e9caa2e98 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.255791] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Waiting for the task: (returnval){ [ 948.255791] env[65680]: value = "task-2847939" [ 948.255791] env[65680]: _type = "Task" [ 948.255791] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.264695] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Task: {'id': task-2847939, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.317910] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847938, 'name': CreateVM_Task, 'duration_secs': 0.33469} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 948.318139] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 948.318808] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.319036] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.319831] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 948.320025] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fa62fe5e-76c9-4a29-a058-03c41117aff1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.326096] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Waiting for the task: (returnval){ [ 948.326096] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]522fea4e-01a3-7072-70a7-5d4067b55fb8" [ 948.326096] env[65680]: _type = "Task" [ 948.326096] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.335873] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]522fea4e-01a3-7072-70a7-5d4067b55fb8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.437542] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Updating instance_info_cache with network_info: [{"id": "908eef60-29d5-4d72-9c39-6c2782adcb09", "address": "fa:16:3e:67:2e:de", "network": {"id": "51e431d8-d398-4406-9103-4b339ab2e1ea", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1099890668-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb30ad84896d480885a85e4621656c6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fe38bb7e-8bcb-419d-868f-0dc105c69651", "external-id": "nsx-vlan-transportzone-432", "segmentation_id": 432, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap908eef60-29", "ovs_interfaceid": "908eef60-29d5-4d72-9c39-6c2782adcb09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.450022] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Releasing lock "refresh_cache-8b747838-fcd0-494c-bd5a-0e5b1950a44e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.450022] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance network_info: |[{"id": "908eef60-29d5-4d72-9c39-6c2782adcb09", "address": "fa:16:3e:67:2e:de", "network": {"id": "51e431d8-d398-4406-9103-4b339ab2e1ea", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1099890668-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb30ad84896d480885a85e4621656c6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fe38bb7e-8bcb-419d-868f-0dc105c69651", "external-id": "nsx-vlan-transportzone-432", "segmentation_id": 432, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap908eef60-29", "ovs_interfaceid": "908eef60-29d5-4d72-9c39-6c2782adcb09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 948.450277] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:67:2e:de', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fe38bb7e-8bcb-419d-868f-0dc105c69651', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '908eef60-29d5-4d72-9c39-6c2782adcb09', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 948.458150] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating folder: Project (bb30ad84896d480885a85e4621656c6a). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 948.461448] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a1ded6c7-cb6f-466a-8531-d3c2e00eebe7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.475041] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Created folder: Project (bb30ad84896d480885a85e4621656c6a) in parent group-v572532. [ 948.475041] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating folder: Instances. Parent ref: group-v572598. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 948.475041] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bd79a1a4-4113-41cd-854b-a218c56ae40d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.487119] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Created folder: Instances in parent group-v572598. [ 948.487119] env[65680]: DEBUG oslo.service.loopingcall [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 948.487119] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 948.487119] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7dfb6f06-96a5-48ae-987e-b44a68230059 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.506499] env[65680]: DEBUG nova.compute.manager [req-405af498-1855-469e-803a-565b324981d2 req-43ae264a-ffae-4ec0-8a57-28047ebdfe17 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Received event network-vif-plugged-01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 948.506887] env[65680]: DEBUG oslo_concurrency.lockutils [req-405af498-1855-469e-803a-565b324981d2 req-43ae264a-ffae-4ec0-8a57-28047ebdfe17 service nova] Acquiring lock "c9230f1c-72ea-4f62-be9f-949def49c5f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.507437] env[65680]: DEBUG oslo_concurrency.lockutils [req-405af498-1855-469e-803a-565b324981d2 req-43ae264a-ffae-4ec0-8a57-28047ebdfe17 service nova] Lock "c9230f1c-72ea-4f62-be9f-949def49c5f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.507744] env[65680]: DEBUG oslo_concurrency.lockutils [req-405af498-1855-469e-803a-565b324981d2 req-43ae264a-ffae-4ec0-8a57-28047ebdfe17 service nova] Lock "c9230f1c-72ea-4f62-be9f-949def49c5f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.508045] env[65680]: DEBUG nova.compute.manager [req-405af498-1855-469e-803a-565b324981d2 req-43ae264a-ffae-4ec0-8a57-28047ebdfe17 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] No waiting events found dispatching network-vif-plugged-01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 948.508344] env[65680]: WARNING nova.compute.manager [req-405af498-1855-469e-803a-565b324981d2 req-43ae264a-ffae-4ec0-8a57-28047ebdfe17 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Received unexpected event network-vif-plugged-01907b62-4b40-4f64-8f92-89a1184281ff for instance with vm_state building and task_state spawning. [ 948.516044] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 948.516044] env[65680]: value = "task-2847942" [ 948.516044] env[65680]: _type = "Task" [ 948.516044] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.522147] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847942, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.638701] env[65680]: DEBUG nova.network.neutron [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Updated VIF entry in instance network info cache for port c953efe3-8348-4fd0-a558-0913fd2880d2. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 948.639133] env[65680]: DEBUG nova.network.neutron [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Updating instance_info_cache with network_info: [{"id": "c953efe3-8348-4fd0-a558-0913fd2880d2", "address": "fa:16:3e:3b:e2:ed", "network": {"id": "45f2c4d5-d2f7-443f-952f-d082cbb816c7", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1906146182-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f89b77f14b3b46e58e32fbb9c68c9ca5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0b29c52-62b0-4a9e-8e1c-41cf6ac8b916", "external-id": "nsx-vlan-transportzone-143", "segmentation_id": 143, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc953efe3-83", "ovs_interfaceid": "c953efe3-8348-4fd0-a558-0913fd2880d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.648700] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Releasing lock "refresh_cache-05ef6eca-eb64-43b3-8c7d-b5a230282a8f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.648969] env[65680]: DEBUG nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Received event network-vif-plugged-d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 948.649213] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Acquiring lock "dd382edd-abe8-4764-a9d5-4144ef7d50b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.649424] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Lock "dd382edd-abe8-4764-a9d5-4144ef7d50b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.649612] env[65680]: DEBUG oslo_concurrency.lockutils 
[req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Lock "dd382edd-abe8-4764-a9d5-4144ef7d50b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.649797] env[65680]: DEBUG nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] No waiting events found dispatching network-vif-plugged-d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 948.649977] env[65680]: WARNING nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Received unexpected event network-vif-plugged-d6d14cac-0618-4f2a-b8a3-caa176d3931c for instance with vm_state building and task_state spawning. [ 948.650169] env[65680]: DEBUG nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Received event network-changed-d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 948.650332] env[65680]: DEBUG nova.compute.manager [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Refreshing instance network info cache due to event network-changed-d6d14cac-0618-4f2a-b8a3-caa176d3931c. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 948.650539] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Acquiring lock "refresh_cache-dd382edd-abe8-4764-a9d5-4144ef7d50b0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.650661] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Acquired lock "refresh_cache-dd382edd-abe8-4764-a9d5-4144ef7d50b0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.651187] env[65680]: DEBUG nova.network.neutron [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Refreshing network info cache for port d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 948.768586] env[65680]: DEBUG oslo_vmware.api [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Task: {'id': task-2847939, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076272} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 948.769988] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 948.769988] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 948.770301] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.770603] env[65680]: INFO nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Took 1.35 seconds to destroy the instance on the hypervisor. [ 948.773385] env[65680]: DEBUG nova.compute.claims [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 948.773625] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.774361] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.814609] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.040s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.817045] env[65680]: DEBUG nova.compute.utils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance cb739449-a329-41b8-964c-8c9db383e846 could not be found. 
{{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 948.818900] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 948.822219] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 948.822416] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 948.822595] env[65680]: DEBUG nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 948.822758] env[65680]: DEBUG nova.network.neutron [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 948.837808] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.837976] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 948.838452] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.860391] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Successfully created port: 
47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 948.880366] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Successfully updated port: 01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 948.890296] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquiring lock "refresh_cache-c9230f1c-72ea-4f62-be9f-949def49c5f4" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.890442] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquired lock "refresh_cache-c9230f1c-72ea-4f62-be9f-949def49c5f4" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.890592] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 948.990774] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 949.016434] env[65680]: DEBUG neutronclient.v2_0.client [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 949.021579] env[65680]: ERROR nova.compute.manager [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] [instance: cb739449-a329-41b8-964c-8c9db383e846] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] result = getattr(controller, method)(*args, **kwargs) [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._get(image_id) [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return RequestIdProxy(wrapped(*args, **kwargs)) [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 949.021579] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] resp, body = self.http_client.get(url, headers=header) [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.request(url, 'GET', **kwargs) [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._handle_response(resp) [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise exc.from_response(resp, resp.content) [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] During handling of the above exception, another exception occurred: [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 949.022166] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.driver.spawn(context, instance, image_meta, [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._vmops.spawn(context, instance, image_meta, injected_files, [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._fetch_image_if_missing(context, vi) [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] image_fetch(context, vi, tmp_image_ds_loc) [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] images.fetch_image( [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] metadata = IMAGE_API.get(context, image_ref) [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 949.022757] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return session.show(context, image_id, [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] _reraise_translated_image_exception(image_id) [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise new_exc.with_traceback(exc_trace) [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File 
"/opt/stack/nova/nova/image/glance.py", line 285, in show [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] result = getattr(controller, method)(*args, **kwargs) [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._get(image_id) [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return RequestIdProxy(wrapped(*args, **kwargs)) [ 949.023307] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] resp, body = self.http_client.get(url, headers=header) [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.request(url, 'GET', **kwargs) [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._handle_response(resp) [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise exc.from_response(resp, resp.content) [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] During handling of the above exception, another exception occurred: [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 949.023592] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._build_and_run_instance(context, instance, image, [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] with excutils.save_and_reraise_exception(): [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.force_reraise() [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise self.value [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] with self.rt.instance_claim(context, instance, node, allocs, [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.abort() [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 949.023967] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return f(*args, **kwargs) [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._unset_instance_host_and_node(instance) [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: 
cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] instance.save() [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] updates, result = self.indirection_api.object_action( [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return cctxt.call(context, 'object_action', objinst=objinst, [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 949.024298] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] result = self.transport._send( [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._driver.send(target, ctxt, message, [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise result [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] nova.exception_Remote.InstanceNotFound_Remote: Instance cb739449-a329-41b8-964c-8c9db383e846 could not be found. 
[ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return getattr(target, method)(*args, **kwargs) [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.024603] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return fn(self, *args, **kwargs) [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] old_ref, inst_ref = db.instance_update_and_get_original( [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return f(*args, **kwargs) [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] with excutils.save_and_reraise_exception() as ectxt: [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.force_reraise() [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025194] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise self.value [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return f(*args, **kwargs) [ 949.025558] 
env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return f(context, *args, **kwargs) [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise exception.InstanceNotFound(instance_id=uuid) [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.025558] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] nova.exception.InstanceNotFound: Instance cb739449-a329-41b8-964c-8c9db383e846 could not be found. [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] During handling of the above exception, another exception occurred: [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] ret = obj(*args, **kwargs) [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] exception_handler_v20(status_code, error_body) [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise client_exc(message=error_message, [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: 
cb739449-a329-41b8-964c-8c9db383e846] Neutron server returns request_ids: ['req-e48ddea6-4656-4821-b816-c7332e0dc36b'] [ 949.026193] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] During handling of the above exception, another exception occurred: [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] Traceback (most recent call last): [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._deallocate_network(context, instance, requested_networks) [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self.network_api.deallocate_for_instance( [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] data = neutron.list_ports(**search_opts) [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] ret = obj(*args, **kwargs) [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 949.026519] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.list('ports', self.ports_path, retrieve_all, [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] ret = obj(*args, **kwargs) [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] for r in self._pagination(collection, path, **params): [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] res = self.get(path, params=params) [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: 
cb739449-a329-41b8-964c-8c9db383e846] ret = obj(*args, **kwargs) [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.retry_request("GET", action, body=body, [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] ret = obj(*args, **kwargs) [ 949.026805] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] return self.do_request(method, action, body=body, [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] ret = obj(*args, **kwargs) [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] self._handle_fault_response(status_code, replybody, resp) [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] raise exception.Unauthorized() [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] nova.exception.Unauthorized: Not authorized. [ 949.027156] env[65680]: ERROR nova.compute.manager [instance: cb739449-a329-41b8-964c-8c9db383e846] [ 949.041032] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847942, 'name': CreateVM_Task, 'duration_secs': 0.311593} completed successfully. 
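The chained tracebacks above terminate in the wrapper at nova/network/neutron.py:196/204, which converts the neutronclient 401 (raised while listing ports during network deallocation) into nova.exception.Unauthorized. A minimal sketch of that kind of exception-translating decorator follows; it is illustrative only, the decorator name is hypothetical, and it is not Nova's actual wrapper. Only the two exception classes named in the traceback are assumed.

    # Sketch of an exception-translating decorator, loosely mirroring the
    # wrapper frames in the traceback above.  Not Nova's implementation.
    import functools

    from neutronclient.common import exceptions as neutron_client_exc
    from nova import exception

    def translate_neutron_unauthorized(func):
        """Re-raise a neutronclient 401 as Nova's own Unauthorized exception."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except neutron_client_exc.Unauthorized:
                # The token used for the Neutron call is no longer valid;
                # surface this to the caller in Nova's exception namespace.
                raise exception.Unauthorized()
        return wrapper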
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 949.041826] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 949.047617] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 949.047759] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 949.048099] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 949.048584] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f6e402b-ff01-4956-aab7-61d362359cf2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.051951] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7f48bfb6-75e6-4865-937e-db31dba7e85d tempest-ServerActionsTestOtherB-1498672476 tempest-ServerActionsTestOtherB-1498672476-project-member] Lock "cb739449-a329-41b8-964c-8c9db383e846" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 312.581s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.053834] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 949.053834] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]526606f6-46f5-be1f-937c-36e7ccca40f0" [ 949.053834] env[65680]: _type = "Task" [ 949.053834] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 949.062173] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]526606f6-46f5-be1f-937c-36e7ccca40f0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 949.127297] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Successfully updated port: 1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 949.136435] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "refresh_cache-01e82211-1de5-44ad-b14e-81a54470d4e5" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 949.136570] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired lock "refresh_cache-01e82211-1de5-44ad-b14e-81a54470d4e5" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 949.136713] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 949.185144] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Updating instance_info_cache with network_info: [{"id": "01907b62-4b40-4f64-8f92-89a1184281ff", "address": "fa:16:3e:ed:dd:71", "network": {"id": "61301e08-7106-4baf-9576-2af862c214d8", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-926929756-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a407e2bf97c14a7f88a7e8229e894d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24727047-6358-4015-86c1-394ab07fb88f", "external-id": "nsx-vlan-transportzone-476", "segmentation_id": 476, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01907b62-4b", "ovs_interfaceid": "01907b62-4b40-4f64-8f92-89a1184281ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.195604] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Releasing lock "refresh_cache-c9230f1c-72ea-4f62-be9f-949def49c5f4" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 949.195871] env[65680]: DEBUG nova.compute.manager [None 
req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance network_info: |[{"id": "01907b62-4b40-4f64-8f92-89a1184281ff", "address": "fa:16:3e:ed:dd:71", "network": {"id": "61301e08-7106-4baf-9576-2af862c214d8", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-926929756-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a407e2bf97c14a7f88a7e8229e894d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24727047-6358-4015-86c1-394ab07fb88f", "external-id": "nsx-vlan-transportzone-476", "segmentation_id": 476, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01907b62-4b", "ovs_interfaceid": "01907b62-4b40-4f64-8f92-89a1184281ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 949.196254] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ed:dd:71', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '24727047-6358-4015-86c1-394ab07fb88f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '01907b62-4b40-4f64-8f92-89a1184281ff', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 949.203618] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Creating folder: Project (a407e2bf97c14a7f88a7e8229e894d7a). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 949.204102] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1633bc20-abed-4c5c-8f67-44af5f75a619 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.216715] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Created folder: Project (a407e2bf97c14a7f88a7e8229e894d7a) in parent group-v572532. [ 949.216895] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Creating folder: Instances. Parent ref: group-v572601. 
{{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 949.217271] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f41dcfa8-569d-40bd-a9e2-a6400edda4fa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.225013] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 949.228437] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Created folder: Instances in parent group-v572601. [ 949.228684] env[65680]: DEBUG oslo.service.loopingcall [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 949.228828] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 949.229030] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-86292d0f-be9e-4d94-bef4-ed90f72abe45 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.247327] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 949.247327] env[65680]: value = "task-2847945" [ 949.247327] env[65680]: _type = "Task" [ 949.247327] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 949.256033] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847945, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 949.382810] env[65680]: DEBUG nova.network.neutron [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Updated VIF entry in instance network info cache for port d6d14cac-0618-4f2a-b8a3-caa176d3931c. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 949.383280] env[65680]: DEBUG nova.network.neutron [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Updating instance_info_cache with network_info: [{"id": "d6d14cac-0618-4f2a-b8a3-caa176d3931c", "address": "fa:16:3e:9b:f1:56", "network": {"id": "720dcf80-b0c2-4e54-948f-b3cfe1e047c4", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-8048602-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1624dcd5d40b4484bc8a806cdcb8c090", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7d09e9-a3dd-4d89-b9dd-2814f5f6dd5d", "external-id": "nsx-vlan-transportzone-591", "segmentation_id": 591, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd6d14cac-06", "ovs_interfaceid": "d6d14cac-0618-4f2a-b8a3-caa176d3931c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.393031] env[65680]: DEBUG oslo_concurrency.lockutils [req-014aef7c-a2cb-493f-85d3-4eba4f4d3c36 req-1d7e1e6d-6c17-4b13-8ce9-42592cbd1654 service nova] Releasing lock "refresh_cache-dd382edd-abe8-4764-a9d5-4144ef7d50b0" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 949.564567] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 949.564827] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 949.565054] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 949.566952] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Updating instance_info_cache with network_info: [{"id": "1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9", "address": "fa:16:3e:91:07:38", "network": 
{"id": "51e431d8-d398-4406-9103-4b339ab2e1ea", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1099890668-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb30ad84896d480885a85e4621656c6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fe38bb7e-8bcb-419d-868f-0dc105c69651", "external-id": "nsx-vlan-transportzone-432", "segmentation_id": 432, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dae8dd6-14", "ovs_interfaceid": "1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.579299] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Releasing lock "refresh_cache-01e82211-1de5-44ad-b14e-81a54470d4e5" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 949.579565] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance network_info: |[{"id": "1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9", "address": "fa:16:3e:91:07:38", "network": {"id": "51e431d8-d398-4406-9103-4b339ab2e1ea", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1099890668-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb30ad84896d480885a85e4621656c6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fe38bb7e-8bcb-419d-868f-0dc105c69651", "external-id": "nsx-vlan-transportzone-432", "segmentation_id": 432, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dae8dd6-14", "ovs_interfaceid": "1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 949.579905] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:91:07:38', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fe38bb7e-8bcb-419d-868f-0dc105c69651', 'network-type': 
'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 949.587411] env[65680]: DEBUG oslo.service.loopingcall [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 949.587826] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 949.588059] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b19c0c20-c54c-4da6-8f21-3bdaf53a0186 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.608937] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 949.608937] env[65680]: value = "task-2847946" [ 949.608937] env[65680]: _type = "Task" [ 949.608937] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 949.616366] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847946, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 949.703465] env[65680]: DEBUG nova.compute.manager [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Received event network-vif-plugged-908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 949.703750] env[65680]: DEBUG oslo_concurrency.lockutils [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] Acquiring lock "8b747838-fcd0-494c-bd5a-0e5b1950a44e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.704106] env[65680]: DEBUG oslo_concurrency.lockutils [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] Lock "8b747838-fcd0-494c-bd5a-0e5b1950a44e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.704400] env[65680]: DEBUG oslo_concurrency.lockutils [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] Lock "8b747838-fcd0-494c-bd5a-0e5b1950a44e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.704607] env[65680]: DEBUG nova.compute.manager [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] No waiting events found dispatching network-vif-plugged-908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 949.704764] env[65680]: WARNING nova.compute.manager 
[req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Received unexpected event network-vif-plugged-908eef60-29d5-4d72-9c39-6c2782adcb09 for instance with vm_state building and task_state spawning. [ 949.704925] env[65680]: DEBUG nova.compute.manager [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Received event network-changed-908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 949.705105] env[65680]: DEBUG nova.compute.manager [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Refreshing instance network info cache due to event network-changed-908eef60-29d5-4d72-9c39-6c2782adcb09. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 949.705309] env[65680]: DEBUG oslo_concurrency.lockutils [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] Acquiring lock "refresh_cache-8b747838-fcd0-494c-bd5a-0e5b1950a44e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 949.705456] env[65680]: DEBUG oslo_concurrency.lockutils [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] Acquired lock "refresh_cache-8b747838-fcd0-494c-bd5a-0e5b1950a44e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 949.705613] env[65680]: DEBUG nova.network.neutron [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Refreshing network info cache for port 908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 949.756681] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847945, 'name': CreateVM_Task, 'duration_secs': 0.286859} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 949.756849] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 949.757554] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 949.757664] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 949.757979] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 949.760922] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-136a9b8c-e640-4d9e-8270-882a921d4e18 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.767071] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Waiting for the task: (returnval){ [ 949.767071] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52c9eb0d-9530-6086-e2c7-cfa0efcbbcd0" [ 949.767071] env[65680]: _type = "Task" [ 949.767071] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 949.778611] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52c9eb0d-9530-6086-e2c7-cfa0efcbbcd0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 950.118424] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847946, 'name': CreateVM_Task, 'duration_secs': 0.315951} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 950.118763] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 950.119426] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.275681] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.275939] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 950.276174] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.276378] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.276672] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 950.276909] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-250e9920-e786-4a65-9738-3d3b2f7d1221 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.281351] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 950.281351] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5249df07-06b3-da71-a3c6-c2814d5bb38e" [ 950.281351] env[65680]: _type = "Task" [ 950.281351] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 950.290948] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5249df07-06b3-da71-a3c6-c2814d5bb38e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 950.301531] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Successfully updated port: 47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 950.311410] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquiring lock "refresh_cache-132e6039-55dc-4118-bcd5-d32557743981" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.311561] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquired lock "refresh_cache-132e6039-55dc-4118-bcd5-d32557743981" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.311720] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 950.381312] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance cache missing network info. {{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 950.411315] env[65680]: DEBUG nova.network.neutron [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Updated VIF entry in instance network info cache for port 908eef60-29d5-4d72-9c39-6c2782adcb09. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 950.411994] env[65680]: DEBUG nova.network.neutron [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Updating instance_info_cache with network_info: [{"id": "908eef60-29d5-4d72-9c39-6c2782adcb09", "address": "fa:16:3e:67:2e:de", "network": {"id": "51e431d8-d398-4406-9103-4b339ab2e1ea", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1099890668-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb30ad84896d480885a85e4621656c6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fe38bb7e-8bcb-419d-868f-0dc105c69651", "external-id": "nsx-vlan-transportzone-432", "segmentation_id": 432, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap908eef60-29", "ovs_interfaceid": "908eef60-29d5-4d72-9c39-6c2782adcb09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.420652] env[65680]: DEBUG oslo_concurrency.lockutils [req-70664011-1cec-4776-bd6e-2279bfdd7572 req-b5940a8e-be9c-410f-8222-ae91a21b0602 service nova] Releasing lock "refresh_cache-8b747838-fcd0-494c-bd5a-0e5b1950a44e" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.525133] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Received event network-vif-plugged-1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.525927] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquiring lock "01e82211-1de5-44ad-b14e-81a54470d4e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.525927] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Lock "01e82211-1de5-44ad-b14e-81a54470d4e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.525927] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Lock "01e82211-1de5-44ad-b14e-81a54470d4e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 950.525927] env[65680]: DEBUG nova.compute.manager 
[req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] No waiting events found dispatching network-vif-plugged-1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 950.526213] env[65680]: WARNING nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Received unexpected event network-vif-plugged-1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 for instance with vm_state building and task_state spawning. [ 950.526213] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Received event network-changed-01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.526305] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Refreshing instance network info cache due to event network-changed-01907b62-4b40-4f64-8f92-89a1184281ff. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 950.526554] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquiring lock "refresh_cache-c9230f1c-72ea-4f62-be9f-949def49c5f4" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.526625] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquired lock "refresh_cache-c9230f1c-72ea-4f62-be9f-949def49c5f4" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.526734] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Refreshing network info cache for port 01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 950.567803] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Updating instance_info_cache with network_info: [{"id": "47906351-0c3f-4c76-9e2f-d586423efb6e", "address": "fa:16:3e:c3:92:7e", "network": {"id": "31247135-cd35-4ff0-9c62-ebcaba71e8b8", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-132621584-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8004b2d89fa641f8881699a2300ecf4f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1002b79b-224e-41e3-a484-4245a767147a", "external-id": 
"nsx-vlan-transportzone-353", "segmentation_id": 353, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap47906351-0c", "ovs_interfaceid": "47906351-0c3f-4c76-9e2f-d586423efb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.577734] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Releasing lock "refresh_cache-132e6039-55dc-4118-bcd5-d32557743981" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.578019] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance network_info: |[{"id": "47906351-0c3f-4c76-9e2f-d586423efb6e", "address": "fa:16:3e:c3:92:7e", "network": {"id": "31247135-cd35-4ff0-9c62-ebcaba71e8b8", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-132621584-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8004b2d89fa641f8881699a2300ecf4f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1002b79b-224e-41e3-a484-4245a767147a", "external-id": "nsx-vlan-transportzone-353", "segmentation_id": 353, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap47906351-0c", "ovs_interfaceid": "47906351-0c3f-4c76-9e2f-d586423efb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 950.578390] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:92:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1002b79b-224e-41e3-a484-4245a767147a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '47906351-0c3f-4c76-9e2f-d586423efb6e', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 950.586130] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Creating folder: Project (8004b2d89fa641f8881699a2300ecf4f). Parent ref: group-v572532. 
{{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 950.586570] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4f843d71-95e5-4f67-a7d0-d9557f00581f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.596777] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Created folder: Project (8004b2d89fa641f8881699a2300ecf4f) in parent group-v572532. [ 950.596948] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Creating folder: Instances. Parent ref: group-v572605. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 950.597175] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-508473b5-7b50-401c-9c1f-e65f38693f35 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.608219] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Created folder: Instances in parent group-v572605. [ 950.608446] env[65680]: DEBUG oslo.service.loopingcall [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 950.608667] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 950.608784] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0b69fc51-9c2b-450a-87e8-ddae960066bc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.630030] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 950.630030] env[65680]: value = "task-2847949" [ 950.630030] env[65680]: _type = "Task" [ 950.630030] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 950.637090] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847949, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 950.792234] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.792492] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 950.792705] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.841648] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Updated VIF entry in instance network info cache for port 01907b62-4b40-4f64-8f92-89a1184281ff. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 950.841990] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Updating instance_info_cache with network_info: [{"id": "01907b62-4b40-4f64-8f92-89a1184281ff", "address": "fa:16:3e:ed:dd:71", "network": {"id": "61301e08-7106-4baf-9576-2af862c214d8", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-926929756-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "a407e2bf97c14a7f88a7e8229e894d7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24727047-6358-4015-86c1-394ab07fb88f", "external-id": "nsx-vlan-transportzone-476", "segmentation_id": 476, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01907b62-4b", "ovs_interfaceid": "01907b62-4b40-4f64-8f92-89a1184281ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.850681] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Releasing lock "refresh_cache-c9230f1c-72ea-4f62-be9f-949def49c5f4" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.850916] env[65680]: DEBUG 
nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Received event network-changed-1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.851097] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Refreshing instance network info cache due to event network-changed-1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 950.851305] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquiring lock "refresh_cache-01e82211-1de5-44ad-b14e-81a54470d4e5" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.851442] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquired lock "refresh_cache-01e82211-1de5-44ad-b14e-81a54470d4e5" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.851609] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Refreshing network info cache for port 1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 951.140384] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847949, 'name': CreateVM_Task, 'duration_secs': 0.278979} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 951.140889] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 951.141279] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.141440] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.141810] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 951.142075] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-42e67e60-79b8-4ba4-9236-63dfb7ecebcd {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.146330] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Updated VIF entry in instance network info cache for port 1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 951.146650] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Updating instance_info_cache with network_info: [{"id": "1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9", "address": "fa:16:3e:91:07:38", "network": {"id": "51e431d8-d398-4406-9103-4b339ab2e1ea", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1099890668-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb30ad84896d480885a85e4621656c6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fe38bb7e-8bcb-419d-868f-0dc105c69651", "external-id": "nsx-vlan-transportzone-432", "segmentation_id": 432, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dae8dd6-14", "ovs_interfaceid": "1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.148781] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Waiting for the task: (returnval){ [ 951.148781] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5206ae77-63c5-d6b9-9421-d18d491c759b" [ 951.148781] env[65680]: _type = "Task" [ 951.148781] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 951.158541] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5206ae77-63c5-d6b9-9421-d18d491c759b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.159462] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Releasing lock "refresh_cache-01e82211-1de5-44ad-b14e-81a54470d4e5" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.159686] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Received event network-vif-plugged-47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 951.160049] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquiring lock "132e6039-55dc-4118-bcd5-d32557743981-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.160268] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Lock "132e6039-55dc-4118-bcd5-d32557743981-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.160423] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Lock "132e6039-55dc-4118-bcd5-d32557743981-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.160586] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] No waiting events found dispatching network-vif-plugged-47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 951.160742] env[65680]: WARNING nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Received unexpected event network-vif-plugged-47906351-0c3f-4c76-9e2f-d586423efb6e for instance with vm_state building and task_state spawning. [ 951.160901] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Received event network-changed-47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 951.161076] env[65680]: DEBUG nova.compute.manager [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Refreshing instance network info cache due to event network-changed-47906351-0c3f-4c76-9e2f-d586423efb6e. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 951.161238] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquiring lock "refresh_cache-132e6039-55dc-4118-bcd5-d32557743981" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.161354] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Acquired lock "refresh_cache-132e6039-55dc-4118-bcd5-d32557743981" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.161501] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Refreshing network info cache for port 47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 951.420391] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Updated VIF entry in instance network info cache for port 47906351-0c3f-4c76-9e2f-d586423efb6e. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 951.420749] env[65680]: DEBUG nova.network.neutron [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Updating instance_info_cache with network_info: [{"id": "47906351-0c3f-4c76-9e2f-d586423efb6e", "address": "fa:16:3e:c3:92:7e", "network": {"id": "31247135-cd35-4ff0-9c62-ebcaba71e8b8", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-132621584-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8004b2d89fa641f8881699a2300ecf4f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1002b79b-224e-41e3-a484-4245a767147a", "external-id": "nsx-vlan-transportzone-353", "segmentation_id": 353, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap47906351-0c", "ovs_interfaceid": "47906351-0c3f-4c76-9e2f-d586423efb6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.429802] env[65680]: DEBUG oslo_concurrency.lockutils [req-8f043780-2ad9-4a8c-a19f-ce802b4cd2f0 req-21f3e9a6-f0e9-4ae4-a4c3-864247df8161 service nova] Releasing lock "refresh_cache-132e6039-55dc-4118-bcd5-d32557743981" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.659520] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.659775] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 951.660015] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 985.294462] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 985.294462] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 985.294462] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 985.312409] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.312575] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.312710] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.312833] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.312957] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.313258] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.313465] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.313597] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 985.313720] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 985.314225] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 986.309078] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 988.293179] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 988.293501] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 990.293700] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 990.294103] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 990.294146] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.293104] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.293353] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.303776] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 991.304062] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 991.304178] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 991.304333] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 991.305446] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70e41320-3819-400c-bd41-2aa904d552ed {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.314189] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f383ce7-28e6-414c-aba1-1036783e6b0c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.327975] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f51ce812-8908-41f2-98ab-0356903343d6 {{(pid=65680) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.334528] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5465078b-b487-40f3-a5a5-5f391ac4f883 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.364870] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180990MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 991.365040] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 991.365228] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 991.421505] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance e5d6d263-463e-46b8-9bb3-d10a4101d4e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.421666] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance abb69e61-9594-48b5-b3f4-f8ba39f93f0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.421796] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 05ef6eca-eb64-43b3-8c7d-b5a230282a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.421919] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 8b747838-fcd0-494c-bd5a-0e5b1950a44e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.422049] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 01e82211-1de5-44ad-b14e-81a54470d4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.422172] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dd382edd-abe8-4764-a9d5-4144ef7d50b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.422287] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c9230f1c-72ea-4f62-be9f-949def49c5f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.422403] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 132e6039-55dc-4118-bcd5-d32557743981 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 991.422631] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 991.422723] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 991.516697] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d63ff2c-3f00-4347-b686-0f8c7a97e6cb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.524398] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa632d4-626e-47b8-bcc1-ed65a9de735b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.553246] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36a9d192-af2c-45e7-8411-61670c801bbe {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.559936] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-146f2b68-1e95-4564-ba8d-a889ae987809 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 991.572615] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 991.580692] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 991.592876] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 991.593085] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 994.920633] env[65680]: WARNING oslo_vmware.rw_handles [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 994.920633] env[65680]: ERROR oslo_vmware.rw_handles [ 994.921455] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 994.922916] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 994.923184] env[65680]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Copying Virtual Disk [datastore1] vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/7a8853e1-4906-4f28-bd7f-2ca99183912a/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 994.923494] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5b95419b-13cc-4118-bfbe-9e093edec8e0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.933870] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Waiting for the task: (returnval){ [ 994.933870] env[65680]: value = "task-2847950" [ 994.933870] env[65680]: _type = "Task" [ 994.933870] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 994.942235] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Task: {'id': task-2847950, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 995.444263] env[65680]: DEBUG oslo_vmware.exceptions [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 995.444561] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 995.445161] env[65680]: ERROR nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 995.445161] env[65680]: Faults: ['InvalidArgument'] [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Traceback (most recent call last): [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] yield resources [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self.driver.spawn(context, instance, image_meta, [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self._fetch_image_if_missing(context, vi) [ 995.445161] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] image_cache(vi, tmp_image_ds_loc) [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] vm_util.copy_virtual_disk( [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] session._wait_for_task(vmdk_copy_task) [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] return self.wait_for_task(task_ref) [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] return evt.wait() [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] result = hub.switch() [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 995.445479] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] return self.greenlet.switch() [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self.f(*self.args, **self.kw) [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] raise exceptions.translate_fault(task_info.error) [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Faults: ['InvalidArgument'] [ 995.445913] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] [ 995.445913] env[65680]: INFO nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Terminating instance [ 995.446991] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 995.447202] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 995.447424] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-354c10a5-c9c3-431c-b250-f648b2fce766 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.449689] 
env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 995.449880] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 995.450603] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0236b3f2-3369-423b-9734-681c1d80c407 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.457716] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 995.457941] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fa4ce356-c445-49a4-aa01-959127040b3f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.460139] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 995.460309] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 995.461246] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4be2cbcd-bd06-472a-b1b6-c05a8849a82d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.465769] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 995.465769] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52f90865-91aa-2700-467e-d9f154345697" [ 995.465769] env[65680]: _type = "Task" [ 995.465769] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 995.473128] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52f90865-91aa-2700-467e-d9f154345697, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 995.532410] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 995.532562] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 995.532798] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Deleting the datastore file [datastore1] e5d6d263-463e-46b8-9bb3-d10a4101d4e0 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 995.533083] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-84d294ca-527a-4a28-be00-d06b65e31eae {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.539287] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Waiting for the task: (returnval){ [ 995.539287] env[65680]: value = "task-2847952" [ 995.539287] env[65680]: _type = "Task" [ 995.539287] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 995.547462] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Task: {'id': task-2847952, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 995.588315] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 995.976987] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 995.976987] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating directory with path [datastore1] vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 995.977559] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f1c5d91-eb5f-40c3-ba4a-0e40bf35e24f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.988068] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created directory with path [datastore1] vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 995.988251] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Fetch image to [datastore1] vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 995.988417] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 995.989115] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69f2cfe9-9432-4dfc-be03-124fa1928384 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.995403] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47e5747b-3637-4127-8ffa-b0f6f83db89e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.004074] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67dafa38-5622-424e-a783-af1b33054866 {{(pid=65680) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.034097] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c12bea4-7cb6-43c4-a5df-30360227a883 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.039242] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-59cd05a7-87d4-438b-8bf2-34bac896b23f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.047607] env[65680]: DEBUG oslo_vmware.api [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Task: {'id': task-2847952, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071154} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 996.047820] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 996.047990] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 996.048175] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 996.048346] env[65680]: INFO nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 996.050426] env[65680]: DEBUG nova.compute.claims [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 996.050687] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.051035] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.064325] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 996.110703] env[65680]: DEBUG oslo_vmware.rw_handles [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 996.169706] env[65680]: DEBUG oslo_vmware.rw_handles [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 996.169897] env[65680]: DEBUG oslo_vmware.rw_handles [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 996.242185] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a1b9a8-3a1e-4d5f-8c44-a6ba5dcc4623 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.249708] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce669992-7bab-440c-8f77-6ed49a5bf4af {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.279323] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c2cb1b7-3309-4c33-90ca-844ccb732558 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.286350] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-764fb694-bd7e-4100-9e83-3cb7efaa5c83 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.298687] env[65680]: DEBUG nova.compute.provider_tree [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 996.306696] env[65680]: DEBUG nova.scheduler.client.report [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 996.324049] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.273s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.324573] env[65680]: ERROR nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 996.324573] env[65680]: Faults: ['InvalidArgument'] [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Traceback (most recent call last): [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 996.324573] env[65680]: ERROR nova.compute.manager 
[instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self.driver.spawn(context, instance, image_meta, [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self._fetch_image_if_missing(context, vi) [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] image_cache(vi, tmp_image_ds_loc) [ 996.324573] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] vm_util.copy_virtual_disk( [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] session._wait_for_task(vmdk_copy_task) [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] return self.wait_for_task(task_ref) [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] return evt.wait() [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] result = hub.switch() [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] return self.greenlet.switch() [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 996.324979] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] self.f(*self.args, **self.kw) [ 996.325351] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 996.325351] 
env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] raise exceptions.translate_fault(task_info.error) [ 996.325351] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 996.325351] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Faults: ['InvalidArgument'] [ 996.325351] env[65680]: ERROR nova.compute.manager [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] [ 996.325351] env[65680]: DEBUG nova.compute.utils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 996.326821] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Build of instance e5d6d263-463e-46b8-9bb3-d10a4101d4e0 was re-scheduled: A specified parameter was not correct: fileType [ 996.326821] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 996.327194] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 996.327364] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 996.327533] env[65680]: DEBUG nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 996.327691] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.649562] env[65680]: DEBUG nova.network.neutron [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 996.663021] env[65680]: INFO nova.compute.manager [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] [instance: e5d6d263-463e-46b8-9bb3-d10a4101d4e0] Took 0.33 seconds to deallocate network for instance. [ 996.750816] env[65680]: INFO nova.scheduler.client.report [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Deleted allocations for instance e5d6d263-463e-46b8-9bb3-d10a4101d4e0 [ 996.773657] env[65680]: DEBUG oslo_concurrency.lockutils [None req-7909227e-e606-4f78-bdfb-e41f3c4ec600 tempest-AttachInterfacesUnderV243Test-854427059 tempest-AttachInterfacesUnderV243Test-854427059-project-member] Lock "e5d6d263-463e-46b8-9bb3-d10a4101d4e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 189.440s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.796891] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquiring lock "c6953476-8f7a-4314-a88e-cd5d02c3309f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.797201] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Lock "c6953476-8f7a-4314-a88e-cd5d02c3309f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.806446] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Starting 
instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1003.852244] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.852489] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.854038] env[65680]: INFO nova.compute.claims [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1004.011447] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-464ed6d9-8ce6-4c53-a7f3-30331a3de9e5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.019306] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-700de6ab-745f-4be1-aeed-2fb401e65b07 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.048406] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-801b2d20-b5a9-4532-9227-59a7e00ae711 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.056090] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e0ee9c3-c6b9-4cce-9f0a-9ff03b4c8a6d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.070384] env[65680]: DEBUG nova.compute.provider_tree [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1004.079179] env[65680]: DEBUG nova.scheduler.client.report [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1004.094093] env[65680]: DEBUG oslo_concurrency.lockutils 
[None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1004.094564] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1004.126399] env[65680]: DEBUG nova.compute.utils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1004.127722] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1004.127899] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1004.136643] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1004.211578] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1004.231058] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1004.231318] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1004.231541] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1004.231750] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1004.231897] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1004.232087] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1004.232311] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1004.232470] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1004.232634] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1004.232795] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1004.232964] env[65680]: DEBUG nova.virt.hardware [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1004.233893] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c8d68e-c236-4588-941e-d6c1ed2bfc96 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.242008] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a906a37d-a8df-4d30-a00b-53697f20aadb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.373710] env[65680]: DEBUG nova.policy [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '25b34d1eb03a4f7e9c78a3e92569dbfa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6731fc8caff545e3bb28c1ba6c407b78', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 1004.723934] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Successfully created port: e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1005.237087] env[65680]: DEBUG nova.compute.manager [req-343e4c33-1c5f-40fd-a357-35b9a88f29a5 req-71a9420e-a4e6-457d-b33a-f0783d1e9bd4 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Received event network-vif-plugged-e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1005.237087] env[65680]: DEBUG oslo_concurrency.lockutils [req-343e4c33-1c5f-40fd-a357-35b9a88f29a5 req-71a9420e-a4e6-457d-b33a-f0783d1e9bd4 service nova] Acquiring lock "c6953476-8f7a-4314-a88e-cd5d02c3309f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1005.237087] env[65680]: DEBUG oslo_concurrency.lockutils [req-343e4c33-1c5f-40fd-a357-35b9a88f29a5 req-71a9420e-a4e6-457d-b33a-f0783d1e9bd4 service nova] Lock "c6953476-8f7a-4314-a88e-cd5d02c3309f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1005.237087] env[65680]: DEBUG oslo_concurrency.lockutils [req-343e4c33-1c5f-40fd-a357-35b9a88f29a5 req-71a9420e-a4e6-457d-b33a-f0783d1e9bd4 service nova] Lock "c6953476-8f7a-4314-a88e-cd5d02c3309f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1005.237403] env[65680]: DEBUG nova.compute.manager [req-343e4c33-1c5f-40fd-a357-35b9a88f29a5 req-71a9420e-a4e6-457d-b33a-f0783d1e9bd4 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] No waiting events found dispatching network-vif-plugged-e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1005.237403] env[65680]: WARNING nova.compute.manager [req-343e4c33-1c5f-40fd-a357-35b9a88f29a5 req-71a9420e-a4e6-457d-b33a-f0783d1e9bd4 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Received unexpected event network-vif-plugged-e890ed3f-45ac-4f3e-9611-61b45d4951d2 for instance with vm_state building and task_state spawning. [ 1005.308197] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Successfully updated port: e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1005.318343] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquiring lock "refresh_cache-c6953476-8f7a-4314-a88e-cd5d02c3309f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1005.318476] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquired lock "refresh_cache-c6953476-8f7a-4314-a88e-cd5d02c3309f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1005.318619] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1005.394114] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1005.637668] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Updating instance_info_cache with network_info: [{"id": "e890ed3f-45ac-4f3e-9611-61b45d4951d2", "address": "fa:16:3e:eb:81:28", "network": {"id": "9f555ec0-232a-45c7-b1e8-790b0d530820", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1951140125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6731fc8caff545e3bb28c1ba6c407b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "163e60bd-32d6-41c5-95e6-2eb10c5c9245", "external-id": "nsx-vlan-transportzone-716", "segmentation_id": 716, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape890ed3f-45", "ovs_interfaceid": "e890ed3f-45ac-4f3e-9611-61b45d4951d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1005.651199] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Releasing lock "refresh_cache-c6953476-8f7a-4314-a88e-cd5d02c3309f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1005.651502] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance network_info: |[{"id": "e890ed3f-45ac-4f3e-9611-61b45d4951d2", "address": "fa:16:3e:eb:81:28", "network": {"id": "9f555ec0-232a-45c7-b1e8-790b0d530820", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1951140125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6731fc8caff545e3bb28c1ba6c407b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "163e60bd-32d6-41c5-95e6-2eb10c5c9245", "external-id": "nsx-vlan-transportzone-716", "segmentation_id": 716, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape890ed3f-45", "ovs_interfaceid": "e890ed3f-45ac-4f3e-9611-61b45d4951d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1005.651899] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:eb:81:28', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '163e60bd-32d6-41c5-95e6-2eb10c5c9245', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e890ed3f-45ac-4f3e-9611-61b45d4951d2', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1005.666523] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Creating folder: Project (6731fc8caff545e3bb28c1ba6c407b78). Parent ref: group-v572532. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1005.667110] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dbec31d3-8d4d-4dc1-85dc-6cc5c02fe060 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.682374] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Created folder: Project (6731fc8caff545e3bb28c1ba6c407b78) in parent group-v572532. [ 1005.682645] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Creating folder: Instances. Parent ref: group-v572608. {{(pid=65680) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1005.683044] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5fca2663-ecff-479a-8966-0518014ca5c0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.693451] env[65680]: INFO nova.virt.vmwareapi.vm_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Created folder: Instances in parent group-v572608. [ 1005.693749] env[65680]: DEBUG oslo.service.loopingcall [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1005.693938] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1005.694221] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-56b008bb-4647-4fa1-be71-696d86c9ede0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.714249] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1005.714249] env[65680]: value = "task-2847955" [ 1005.714249] env[65680]: _type = "Task" [ 1005.714249] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1005.721278] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847955, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1006.226039] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847955, 'name': CreateVM_Task, 'duration_secs': 0.279302} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1006.226039] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1006.226223] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1006.226378] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1006.226704] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1006.226934] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-09308d31-ea7e-4d32-b321-e5f5ce6fb8c9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1006.231250] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Waiting for the task: (returnval){ [ 1006.231250] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52d73f0a-aaf0-c64f-9355-e434efc8303a" [ 1006.231250] env[65680]: _type = "Task" [ 1006.231250] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1006.238704] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52d73f0a-aaf0-c64f-9355-e434efc8303a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1006.742071] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1006.742343] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1006.742563] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1007.264406] env[65680]: DEBUG nova.compute.manager [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Received event network-changed-e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1007.264672] env[65680]: DEBUG nova.compute.manager [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Refreshing instance network info cache due to event network-changed-e890ed3f-45ac-4f3e-9611-61b45d4951d2. 
{{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1007.264790] env[65680]: DEBUG oslo_concurrency.lockutils [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] Acquiring lock "refresh_cache-c6953476-8f7a-4314-a88e-cd5d02c3309f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1007.264933] env[65680]: DEBUG oslo_concurrency.lockutils [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] Acquired lock "refresh_cache-c6953476-8f7a-4314-a88e-cd5d02c3309f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1007.265105] env[65680]: DEBUG nova.network.neutron [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Refreshing network info cache for port e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1007.521143] env[65680]: DEBUG nova.network.neutron [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Updated VIF entry in instance network info cache for port e890ed3f-45ac-4f3e-9611-61b45d4951d2. {{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1007.521513] env[65680]: DEBUG nova.network.neutron [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Updating instance_info_cache with network_info: [{"id": "e890ed3f-45ac-4f3e-9611-61b45d4951d2", "address": "fa:16:3e:eb:81:28", "network": {"id": "9f555ec0-232a-45c7-b1e8-790b0d530820", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1951140125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6731fc8caff545e3bb28c1ba6c407b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "163e60bd-32d6-41c5-95e6-2eb10c5c9245", "external-id": "nsx-vlan-transportzone-716", "segmentation_id": 716, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape890ed3f-45", "ovs_interfaceid": "e890ed3f-45ac-4f3e-9611-61b45d4951d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1007.530522] env[65680]: DEBUG oslo_concurrency.lockutils [req-9f07ac16-51ac-4f07-954f-e47c9f77c5ff req-a350f5b9-968a-441f-8d09-f04a0daef967 service nova] Releasing lock "refresh_cache-c6953476-8f7a-4314-a88e-cd5d02c3309f" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1041.751021] env[65680]: WARNING oslo_vmware.rw_handles [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Error occurred while reading 
the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.751021] env[65680]: ERROR oslo_vmware.rw_handles [ 1041.751021] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1041.751720] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1041.751996] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Copying Virtual Disk [datastore1] vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/8f7013db-efdd-4d57-afaf-f13aede9bcb0/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1041.752315] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8b77e5ad-c814-4959-bdde-6b5651c781f8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.759867] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1041.759867] env[65680]: value = "task-2847956" [ 1041.759867] env[65680]: _type = "Task" [ 1041.759867] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.767781] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847956, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.270466] env[65680]: DEBUG oslo_vmware.exceptions [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1042.270674] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1042.271246] env[65680]: ERROR nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.271246] env[65680]: Faults: ['InvalidArgument'] [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Traceback (most recent call last): [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] yield resources [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self.driver.spawn(context, instance, image_meta, [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self._fetch_image_if_missing(context, vi) [ 1042.271246] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] image_cache(vi, tmp_image_ds_loc) [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] vm_util.copy_virtual_disk( [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] session._wait_for_task(vmdk_copy_task) [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] return self.wait_for_task(task_ref) [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] return evt.wait() [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] result = hub.switch() [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1042.271609] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] return self.greenlet.switch() [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self.f(*self.args, **self.kw) [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] raise exceptions.translate_fault(task_info.error) [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Faults: ['InvalidArgument'] [ 1042.271939] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] [ 1042.271939] env[65680]: INFO nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Terminating instance [ 1042.273689] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" 
{{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1042.273689] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1042.274422] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1042.274614] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1042.274831] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-715cab62-08bb-43ed-a137-7ffcc88c2a62 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.277440] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-863a8e2d-765f-40a2-9f4c-18d3f8abe66d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.283675] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1042.283873] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a0fe0e15-79e8-4088-acae-7e7be00bb160 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.285963] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1042.286151] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1042.287072] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6cd1c9ab-4340-4cb4-bd22-ebe6da6864bc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.291908] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Waiting for the task: (returnval){ [ 1042.291908] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5294b4e0-2bae-bd94-04c7-5f60d35f4ae5" [ 1042.291908] env[65680]: _type = "Task" [ 1042.291908] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1042.298526] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5294b4e0-2bae-bd94-04c7-5f60d35f4ae5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.357670] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1042.357884] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1042.358066] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleting the datastore file [datastore1] abb69e61-9594-48b5-b3f4-f8ba39f93f0e {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1042.358319] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5978c847-0c53-4384-9dae-8569575602c0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.364084] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1042.364084] env[65680]: value = "task-2847958" [ 1042.364084] env[65680]: _type = "Task" [ 1042.364084] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1042.371338] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847958, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.803749] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1042.804066] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Creating directory with path [datastore1] vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1042.804272] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f0758e4-0ea3-4232-9de7-7c5fd838d974 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.815875] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Created directory with path [datastore1] vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1042.816075] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Fetch image to [datastore1] vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1042.816276] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1042.817046] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66330945-32d8-43c4-851b-945a4eabf2e6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.823704] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aff9d9dd-ea3d-44c1-97b1-ead40ad3534d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.832659] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1b89d2c-0dbd-409c-8824-832dbc59df88 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.863905] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a14b517-1c06-48ea-aa4e-0b38808bae1d 
{{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.875032] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-07ea3da9-ab34-489c-abd7-52729637afd8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.876678] env[65680]: DEBUG oslo_vmware.api [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847958, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.101358} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1042.876908] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1042.877098] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1042.877279] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1042.877451] env[65680]: INFO nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Took 0.60 seconds to destroy the instance on the hypervisor. 
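The "Task: {...} progress is N%" polling lines and the traceback above (ending in "raise exceptions.translate_fault(task_info.error)") both reflect the same wait-for-task pattern: a vCenter task such as CopyVirtualDisk or DeleteDatastoreFile is polled until it reports success or an error, and an error is re-raised as a VimFaultException carrying the fault list (here 'InvalidArgument' for the fileType parameter). The short Python sketch below is only an illustrative mock of that poll-and-translate loop; FakeTaskInfo, FakeVimFault and poll_task are hypothetical names for this sketch and are not the oslo.vmware implementation.

# Illustrative sketch only (not oslo.vmware source): mimics the poll loop visible in
# the log, where a VMware task is polled until it either succeeds or carries an
# error that is re-raised as a fault exception (cf. "_poll_task ... raise
# exceptions.translate_fault(task_info.error)" in the traceback above).
import time
from dataclasses import dataclass, field


@dataclass
class FakeTaskInfo:                      # hypothetical stand-in for a vim TaskInfo
    state: str = "running"               # "running" | "success" | "error"
    progress: int = 0
    error: str | None = None
    fault_list: list[str] = field(default_factory=list)


class FakeVimFault(Exception):
    """Hypothetical stand-in for a fault exception carrying a fault list."""
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list


def poll_task(read_task_info, interval=0.5, timeout=30.0):
    """Poll a task until completion, mirroring the log's behaviour:
    - a "progress is N%" line while the task is running,
    - normal return on success,
    - a fault exception when the task reports an error."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = read_task_info()
        if info.state == "success":
            return info
        if info.state == "error":
            # Comparable to translate_fault(task_info.error) in the traceback.
            raise FakeVimFault(info.error, info.fault_list)
        print(f"Task progress is {info.progress}%.")
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")


if __name__ == "__main__":
    # Example: a task that fails the way the CopyVirtualDisk task fails in this log.
    states = iter([
        FakeTaskInfo(state="running", progress=0),
        FakeTaskInfo(state="error",
                     error="A specified parameter was not correct: fileType",
                     fault_list=["InvalidArgument"]),
    ])
    try:
        poll_task(lambda: next(states), interval=0.01)
    except FakeVimFault as exc:
        print(f"VimFaultException: {exc} Faults: {exc.fault_list}")

Running the sketch prints one progress line and then the translated fault, which is the same shape as the failure path recorded for the image-cache copy above; the DeleteDatastoreFile_Task entries show the corresponding success path of the same loop.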
[ 1042.879759] env[65680]: DEBUG nova.compute.claims [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1042.879935] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1042.880987] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1042.898053] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1042.944678] env[65680]: DEBUG oslo_vmware.rw_handles [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1043.004324] env[65680]: DEBUG oslo_vmware.rw_handles [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1043.004521] env[65680]: DEBUG oslo_vmware.rw_handles [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1043.067491] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad157870-7d25-403a-b8e2-fe5b40922aab {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.074554] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f91f3233-f428-4cf4-a686-d314616e7e40 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.105325] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00d4e059-46db-4fe0-b25a-53b30d2d274c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.111982] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61be941-6952-4d97-8425-ac20f7c67c1b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.124707] env[65680]: DEBUG nova.compute.provider_tree [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1043.133039] env[65680]: DEBUG nova.scheduler.client.report [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1043.146634] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.266s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.147166] env[65680]: ERROR nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1043.147166] env[65680]: Faults: ['InvalidArgument'] [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Traceback (most recent call last): [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: 
abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self.driver.spawn(context, instance, image_meta, [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self._fetch_image_if_missing(context, vi) [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] image_cache(vi, tmp_image_ds_loc) [ 1043.147166] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] vm_util.copy_virtual_disk( [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] session._wait_for_task(vmdk_copy_task) [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] return self.wait_for_task(task_ref) [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] return evt.wait() [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] result = hub.switch() [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] return self.greenlet.switch() [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1043.147507] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] self.f(*self.args, **self.kw) [ 1043.147810] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1043.147810] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] raise exceptions.translate_fault(task_info.error) [ 1043.147810] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1043.147810] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Faults: ['InvalidArgument'] [ 1043.147810] env[65680]: ERROR nova.compute.manager [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] [ 1043.147930] env[65680]: DEBUG nova.compute.utils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1043.149262] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Build of instance abb69e61-9594-48b5-b3f4-f8ba39f93f0e was re-scheduled: A specified parameter was not correct: fileType [ 1043.149262] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1043.149659] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1043.149830] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1043.150009] env[65680]: DEBUG nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1043.150175] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.440073] env[65680]: DEBUG nova.network.neutron [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.451086] env[65680]: INFO nova.compute.manager [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: abb69e61-9594-48b5-b3f4-f8ba39f93f0e] Took 0.30 seconds to deallocate network for instance. [ 1043.544844] env[65680]: INFO nova.scheduler.client.report [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleted allocations for instance abb69e61-9594-48b5-b3f4-f8ba39f93f0e [ 1043.560505] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2aae3c65-1bcd-42f8-a115-ca022c91a237 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "abb69e61-9594-48b5-b3f4-f8ba39f93f0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 170.508s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1045.162198] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1045.162503] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1045.173999] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Starting instance... 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1045.218799] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1045.219044] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1045.220403] env[65680]: INFO nova.compute.claims [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1045.349643] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b527cc5-f68e-4f96-af93-7eecfb88e9f0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.356979] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d330af-442a-4810-a0b7-6c8fa584d56f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.385372] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cbdfa8c-c7b8-4cc8-b5ed-5e226d228c43 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.392282] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adbca28b-4648-4f5e-af73-a956b1edabde {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.405572] env[65680]: DEBUG nova.compute.provider_tree [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1045.413855] env[65680]: DEBUG nova.scheduler.client.report [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1045.426848] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e 
tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1045.427284] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1045.460242] env[65680]: DEBUG nova.compute.utils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1045.461409] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Allocating IP information in the background. {{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1045.461579] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1045.470610] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1045.521819] env[65680]: DEBUG nova.policy [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0a4078f7644f57884a39d3369ceb7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ec0d6e13ecf4b72b79052a4077a754f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 1045.529805] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1045.550279] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1045.550512] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1045.550683] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1045.550867] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1045.551025] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1045.551967] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1045.551967] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1045.551967] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1045.551967] env[65680]: DEBUG 
nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1045.551967] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1045.552228] env[65680]: DEBUG nova.virt.hardware [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1045.554169] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cdff99c-7d19-4718-80cb-ccd82ba8b6f7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.560929] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca20ef4e-c761-4d8f-b3c2-391a6349a034 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.788578] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Successfully created port: 2ead27b4-9436-424e-8408-0ddad922c076 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1046.292434] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1046.292726] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1046.392492] env[65680]: DEBUG nova.compute.manager [req-0e1b4d97-a892-4efa-92c5-510074856b05 req-fcb7f96c-3687-4915-8d16-f6b96e937b02 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Received event network-vif-plugged-2ead27b4-9436-424e-8408-0ddad922c076 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1046.392711] env[65680]: DEBUG oslo_concurrency.lockutils [req-0e1b4d97-a892-4efa-92c5-510074856b05 req-fcb7f96c-3687-4915-8d16-f6b96e937b02 service nova] Acquiring lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1046.392933] env[65680]: DEBUG oslo_concurrency.lockutils [req-0e1b4d97-a892-4efa-92c5-510074856b05 req-fcb7f96c-3687-4915-8d16-f6b96e937b02 service nova] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s 
{{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1046.393093] env[65680]: DEBUG oslo_concurrency.lockutils [req-0e1b4d97-a892-4efa-92c5-510074856b05 req-fcb7f96c-3687-4915-8d16-f6b96e937b02 service nova] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1046.393292] env[65680]: DEBUG nova.compute.manager [req-0e1b4d97-a892-4efa-92c5-510074856b05 req-fcb7f96c-3687-4915-8d16-f6b96e937b02 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] No waiting events found dispatching network-vif-plugged-2ead27b4-9436-424e-8408-0ddad922c076 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1046.393416] env[65680]: WARNING nova.compute.manager [req-0e1b4d97-a892-4efa-92c5-510074856b05 req-fcb7f96c-3687-4915-8d16-f6b96e937b02 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Received unexpected event network-vif-plugged-2ead27b4-9436-424e-8408-0ddad922c076 for instance with vm_state building and task_state spawning. [ 1046.466139] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Successfully updated port: 2ead27b4-9436-424e-8408-0ddad922c076 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1046.475060] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "refresh_cache-dac6ccec-b1a2-47d5-9750-d5f59f6743ae" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1046.475279] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "refresh_cache-dac6ccec-b1a2-47d5-9750-d5f59f6743ae" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1046.475498] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1046.508267] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1046.850433] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Updating instance_info_cache with network_info: [{"id": "2ead27b4-9436-424e-8408-0ddad922c076", "address": "fa:16:3e:d6:cd:a2", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ead27b4-94", "ovs_interfaceid": "2ead27b4-9436-424e-8408-0ddad922c076", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1046.863596] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "refresh_cache-dac6ccec-b1a2-47d5-9750-d5f59f6743ae" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1046.863890] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance network_info: |[{"id": "2ead27b4-9436-424e-8408-0ddad922c076", "address": "fa:16:3e:d6:cd:a2", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ead27b4-94", "ovs_interfaceid": "2ead27b4-9436-424e-8408-0ddad922c076", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1046.864353] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d6:cd:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a06a63d6-2aeb-4084-8022-f804cac3fa74', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2ead27b4-9436-424e-8408-0ddad922c076', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1046.871764] env[65680]: DEBUG oslo.service.loopingcall [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1046.872222] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1046.872439] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4542dd28-123e-48fc-b7ee-75380987edb6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1046.892047] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1046.892047] env[65680]: value = "task-2847959" [ 1046.892047] env[65680]: _type = "Task" [ 1046.892047] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1046.899279] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847959, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.292874] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1047.293193] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1047.293193] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1047.310372] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.310522] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Skipping network cache update for instance because it is Building. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.310654] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.310779] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.310900] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.311147] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.311341] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.311469] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1047.311589] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1047.402390] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847959, 'name': CreateVM_Task, 'duration_secs': 0.301135} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1047.402543] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1047.403227] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1047.403392] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1047.403733] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1047.404014] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-84f82865-62bd-4051-bc4a-28fe5a124bfa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.408520] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1047.408520] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]523c4bba-8df4-6a0e-17e7-00a28806674f" [ 1047.408520] env[65680]: _type = "Task" [ 1047.408520] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.416793] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]523c4bba-8df4-6a0e-17e7-00a28806674f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.918596] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1047.918852] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1047.919079] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1048.417254] env[65680]: DEBUG nova.compute.manager [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Received event network-changed-2ead27b4-9436-424e-8408-0ddad922c076 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1048.417511] env[65680]: DEBUG nova.compute.manager [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Refreshing instance network info cache due to event network-changed-2ead27b4-9436-424e-8408-0ddad922c076. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1048.417635] env[65680]: DEBUG oslo_concurrency.lockutils [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] Acquiring lock "refresh_cache-dac6ccec-b1a2-47d5-9750-d5f59f6743ae" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1048.417802] env[65680]: DEBUG oslo_concurrency.lockutils [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] Acquired lock "refresh_cache-dac6ccec-b1a2-47d5-9750-d5f59f6743ae" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1048.417974] env[65680]: DEBUG nova.network.neutron [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Refreshing network info cache for port 2ead27b4-9436-424e-8408-0ddad922c076 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1048.644427] env[65680]: DEBUG nova.network.neutron [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Updated VIF entry in instance network info cache for port 2ead27b4-9436-424e-8408-0ddad922c076. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1048.644753] env[65680]: DEBUG nova.network.neutron [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Updating instance_info_cache with network_info: [{"id": "2ead27b4-9436-424e-8408-0ddad922c076", "address": "fa:16:3e:d6:cd:a2", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ead27b4-94", "ovs_interfaceid": "2ead27b4-9436-424e-8408-0ddad922c076", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1048.653829] env[65680]: DEBUG oslo_concurrency.lockutils [req-7f1d7c4f-e2e7-47af-b673-c816c255bb1b req-656e5640-f3a3-42a5-91d2-e5713225cf56 service nova] Releasing lock "refresh_cache-dac6ccec-b1a2-47d5-9750-d5f59f6743ae" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1049.293096] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1049.293297] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1052.293838] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1052.294150] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1052.294327] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1052.294524] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1053.293495] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1053.303039] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1053.303346] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1053.303445] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1053.303561] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1053.304629] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2effeef2-d5b7-46ee-91eb-d61da778bae8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.314461] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce1b3cf-f40e-4c08-a982-268c31333c85 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.328017] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15e6b2a4-ac74-42e4-8bc5-ecc12153418a 
{{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.334177] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21818cf2-f3f1-4938-a946-9b0d663e8d8e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.362406] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181053MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1053.362554] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1053.362744] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1053.431623] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 05ef6eca-eb64-43b3-8c7d-b5a230282a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.431723] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 8b747838-fcd0-494c-bd5a-0e5b1950a44e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.431884] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 01e82211-1de5-44ad-b14e-81a54470d4e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.431966] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dd382edd-abe8-4764-a9d5-4144ef7d50b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.432112] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c9230f1c-72ea-4f62-be9f-949def49c5f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.432248] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 132e6039-55dc-4118-bcd5-d32557743981 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.432365] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c6953476-8f7a-4314-a88e-cd5d02c3309f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.432477] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.432667] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1053.432803] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1053.537814] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee190161-5613-454d-98cd-c71fe58d0de6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.545466] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b884040-ed6d-4408-8d46-d5246260e559 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.575954] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66020930-d09f-482e-9725-929aaeb933f8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.582898] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e3643d2-b149-4d04-9898-5ce07002e939 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.595671] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1053.604183] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory 
data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1053.618640] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1053.618821] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1070.480235] env[65680]: DEBUG nova.compute.manager [req-d4403246-f8f8-43b0-afe2-1b73b011a1ae req-ba41dafb-5acb-4602-a0b4-14927fb776b2 service nova] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Received event network-vif-deleted-1dae8dd6-14d6-4b7e-b7b6-e9caf63340f9 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1070.905814] env[65680]: DEBUG nova.compute.manager [req-a22e3d9b-f0df-48e3-ac7b-fd1444b5b349 req-91ea87b4-97a2-421a-9110-a42352baa8bd service nova] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Received event network-vif-deleted-908eef60-29d5-4d72-9c39-6c2782adcb09 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1072.543191] env[65680]: DEBUG nova.compute.manager [req-0f1698d5-8825-4205-a2c6-085ee399140c req-6812fcab-d365-4bee-8b7c-64a414d41387 service nova] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Received event network-vif-deleted-d6d14cac-0618-4f2a-b8a3-caa176d3931c {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1072.933987] env[65680]: DEBUG nova.compute.manager [req-ad63fc5f-830b-4a79-a05d-6e3c1e989c93 req-386c99d8-ac69-4d80-b88c-96427e6b68ae service nova] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Received event network-vif-deleted-c953efe3-8348-4fd0-a558-0913fd2880d2 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1075.580387] env[65680]: DEBUG nova.compute.manager [req-a5515139-b504-4821-8468-de1976e0d0d5 req-aca0b231-dc92-4b11-9d80-e2285c76d3e5 service nova] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Received event network-vif-deleted-01907b62-4b40-4f64-8f92-89a1184281ff {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1076.806319] env[65680]: DEBUG nova.compute.manager [req-abeae99c-6468-41c1-99af-179e6433fe26 req-0501ae2b-3dfb-4e41-aefe-17ab168b15f7 service nova] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Received event network-vif-deleted-47906351-0c3f-4c76-9e2f-d586423efb6e {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1089.956104] env[65680]: WARNING oslo_vmware.rw_handles [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end 
closed connection without response [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1089.956104] env[65680]: ERROR oslo_vmware.rw_handles [ 1089.956909] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1089.958456] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1089.958711] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Copying Virtual Disk [datastore1] vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/e4fe1ac6-59e2-4041-95e3-5778cc3bc27c/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1089.958989] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c68170da-9144-47ae-a1d4-7c41e17cd49e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1089.968310] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Waiting for the task: (returnval){ [ 1089.968310] env[65680]: value = "task-2847960" [ 1089.968310] env[65680]: _type = "Task" [ 1089.968310] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1089.976109] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Task: {'id': task-2847960, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1090.478427] env[65680]: DEBUG oslo_vmware.exceptions [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1090.478666] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1090.479250] env[65680]: ERROR nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1090.479250] env[65680]: Faults: ['InvalidArgument'] [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Traceback (most recent call last): [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] yield resources [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] self.driver.spawn(context, instance, image_meta, [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] self._fetch_image_if_missing(context, vi) [ 1090.479250] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] image_cache(vi, tmp_image_ds_loc) [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 
05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] vm_util.copy_virtual_disk( [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] session._wait_for_task(vmdk_copy_task) [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] return self.wait_for_task(task_ref) [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] return evt.wait() [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] result = hub.switch() [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1090.479611] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] return self.greenlet.switch() [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] self.f(*self.args, **self.kw) [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] raise exceptions.translate_fault(task_info.error) [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Faults: ['InvalidArgument'] [ 1090.479932] env[65680]: ERROR nova.compute.manager [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] [ 1090.479932] env[65680]: INFO nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Terminating instance [ 1090.481604] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1090.481604] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1090.481997] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1090.482198] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1090.482462] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-47f1e7b1-e2a3-4f4b-9404-c1531dafe920 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1090.484751] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3722d4d8-cbcc-475e-bf15-5f206d7dde79 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1090.491593] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1090.491804] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9806eafc-f791-4e49-87fe-adf24c3ecde6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1090.494183] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1090.494350] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1090.495343] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38b76441-21a5-41c7-8d4d-c62752c4fdd2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1090.500064] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Waiting for the task: (returnval){ [ 1090.500064] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5205966e-2b6c-50f4-dc44-64a769fc7d11" [ 1090.500064] env[65680]: _type = "Task" [ 1090.500064] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1090.507366] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5205966e-2b6c-50f4-dc44-64a769fc7d11, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1090.557276] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1090.557532] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1090.557694] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Deleting the datastore file [datastore1] 05ef6eca-eb64-43b3-8c7d-b5a230282a8f {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1090.557982] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-30f63196-ab34-4459-a1ac-81302ddb8766 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1090.564562] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Waiting for the task: (returnval){ [ 1090.564562] env[65680]: value = "task-2847962" [ 1090.564562] env[65680]: _type = "Task" [ 1090.564562] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1090.572137] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Task: {'id': task-2847962, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1091.011425] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1091.011756] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Creating directory with path [datastore1] vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1091.011906] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a678131a-1295-4f9e-931b-ac634e533b1e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.022828] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Created directory with path [datastore1] vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1091.023020] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Fetch image to [datastore1] vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1091.023184] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1091.023945] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6da4cf45-37d8-4837-bc4b-08581aba97ca {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.033463] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44480122-639a-4215-905f-50a0e3039e6a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.043108] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d77406-0d92-401f-b028-ee25c02b5809 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.076133] env[65680]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-616f3321-e1ce-48d8-89e3-8e0cac5f908f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.083102] env[65680]: DEBUG oslo_vmware.api [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Task: {'id': task-2847962, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070541} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1091.084593] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1091.084794] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1091.084963] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1091.085151] env[65680]: INFO nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1091.087183] env[65680]: DEBUG nova.compute.claims [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1091.087352] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1091.087557] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1091.090281] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9e5dfd05-768e-4c2b-818d-d31eceb770d8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.111261] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1091.116146] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1091.116883] env[65680]: DEBUG nova.compute.utils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance 05ef6eca-eb64-43b3-8c7d-b5a230282a8f could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1091.118583] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance disappeared during build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1091.119016] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1091.119016] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1091.119162] env[65680]: DEBUG nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1091.119323] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1091.146100] env[65680]: DEBUG nova.network.neutron [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1091.158062] env[65680]: INFO nova.compute.manager [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Took 0.04 seconds to deallocate network for instance. [ 1091.172549] env[65680]: DEBUG oslo_vmware.rw_handles [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1091.232640] env[65680]: DEBUG oslo_vmware.rw_handles [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Completed reading data from the image iterator. 
{{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1091.232816] env[65680]: DEBUG oslo_vmware.rw_handles [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1091.249058] env[65680]: DEBUG oslo_concurrency.lockutils [None req-73256141-fcc7-4312-9ea5-103186c667fe tempest-AttachVolumeShelveTestJSON-1895983947 tempest-AttachVolumeShelveTestJSON-1895983947-project-member] Lock "05ef6eca-eb64-43b3-8c7d-b5a230282a8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.869s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1102.292986] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1102.293337] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Cleaning up deleted instances with incomplete migration {{(pid=65680) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1106.297420] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1106.297786] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1106.297827] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1106.297956] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Cleaning up deleted instances {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1106.330426] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] There are 13 instances to clean {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1106.330585] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.351553] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.384855] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.405624] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.423230] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.443744] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 05ef6eca-eb64-43b3-8c7d-b5a230282a8f] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.462777] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: cb739449-a329-41b8-964c-8c9db383e846] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.482634] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: b935e1a7-1c77-4398-a964-cd7da312fc1b] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.501817] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: b163d5b8-b01c-4ace-96e7-56276ab4ba82] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.522424] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 2f6ce1b8-d869-4219-851a-43ae3ddd3816] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.541889] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 40a7ee3c-8627-47f3-887e-31112586e799] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.562558] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f989cbee-9d5c-459f-b7a0-bf2259dadbb0] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1106.580742] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: f05204a0-268f-4d77-a2bf-cde4ee02915e] Instance has had 0 of 5 cleanup attempts {{(pid=65680) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1108.594404] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1108.594818] env[65680]: DEBUG nova.compute.manager [None 
req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1108.594818] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1108.606337] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1108.606504] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1108.606643] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1111.293675] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1111.293977] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1112.293439] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1114.293067] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1114.293328] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1114.293482] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1114.293636] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1114.302673] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1114.302883] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1114.303056] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1114.303214] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1114.304249] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84a31ef2-0d73-4632-8a5d-40e5089a0b51 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.313274] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72286bcb-a86e-4b22-9a0c-7730241c4abe {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.326935] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266b1507-709c-4d5a-90cb-bb46e0823ae2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.332773] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c22acfb-8203-4b08-bbeb-6db8637f0aa5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.360839] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181021MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1114.360979] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1114.361178] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1114.489083] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c6953476-8f7a-4314-a88e-cd5d02c3309f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1114.489083] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1114.489083] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1114.489267] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1114.503853] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Refreshing inventories for resource provider 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1114.515961] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Updating ProviderTree inventory for provider 93ae29e4-bd04-4c19-80be-8057217cf400 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1114.516195] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Updating inventory in ProviderTree for provider 93ae29e4-bd04-4c19-80be-8057217cf400 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1114.526170] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Refreshing aggregate associations for resource provider 93ae29e4-bd04-4c19-80be-8057217cf400, aggregates: None {{(pid=65680) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1114.540426] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Refreshing trait associations for resource provider 93ae29e4-bd04-4c19-80be-8057217cf400, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=65680) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1114.570863] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb194dd0-c0eb-4384-bede-1d357fd9d54f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.578018] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c114018-7a15-47b1-ac1b-8cdab7a476a0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.606485] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d14d84e3-ea3e-4bdc-9efa-3b93b067db33 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.612881] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55af1d08-eaac-41ab-942f-2ac6187fda6c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1114.625291] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1114.633289] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1114.645913] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1114.646101] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1116.293494] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1116.306348] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.974455] env[65680]: WARNING oslo_vmware.rw_handles [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Error occurred 
while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1139.974455] env[65680]: ERROR oslo_vmware.rw_handles [ 1139.975330] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1139.976800] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1139.977067] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Copying Virtual Disk [datastore1] vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/e244e50b-0316-4924-a303-28dde85339dc/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1139.977350] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4c8f9bb6-0816-4079-becc-e6ff04c4149e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1139.986651] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Waiting for the task: (returnval){ [ 1139.986651] env[65680]: value = "task-2847963" [ 1139.986651] env[65680]: _type = "Task" [ 1139.986651] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1139.994439] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Task: {'id': task-2847963, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1140.497220] env[65680]: DEBUG oslo_vmware.exceptions [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1140.497464] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1140.497993] env[65680]: ERROR nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1140.497993] env[65680]: Faults: ['InvalidArgument'] [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Traceback (most recent call last): [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] yield resources [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] self.driver.spawn(context, instance, image_meta, [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] self._fetch_image_if_missing(context, vi) [ 1140.497993] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] image_cache(vi, tmp_image_ds_loc) [ 1140.498534] env[65680]: ERROR 
nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] vm_util.copy_virtual_disk( [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] session._wait_for_task(vmdk_copy_task) [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] return self.wait_for_task(task_ref) [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] return evt.wait() [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] result = hub.switch() [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1140.498534] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] return self.greenlet.switch() [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] self.f(*self.args, **self.kw) [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] raise exceptions.translate_fault(task_info.error) [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Faults: ['InvalidArgument'] [ 1140.498937] env[65680]: ERROR nova.compute.manager [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] [ 1140.498937] env[65680]: INFO nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Terminating instance [ 1140.499858] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1140.500074] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1140.500308] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ad131a4-ad44-4ab7-a511-fb81983fdd9f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1140.502567] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1140.502753] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1140.503452] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-301c5ac1-0bb7-45f9-82aa-6861415bbff6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1140.509728] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1140.509917] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-10422608-509c-4d68-8f60-5064ccebfd45 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1140.511852] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1140.512035] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1140.512926] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e1651a17-0bf7-4334-a1bd-e8e4bd93a9c9 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1140.517436] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 1140.517436] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]525c4a37-096f-1a04-8d18-e81c1f553033" [ 1140.517436] env[65680]: _type = "Task" [ 1140.517436] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1140.524345] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]525c4a37-096f-1a04-8d18-e81c1f553033, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1140.577254] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1140.577469] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1140.577648] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Deleting the datastore file [datastore1] dd382edd-abe8-4764-a9d5-4144ef7d50b0 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1140.578142] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6cced9b5-98bc-48b1-a178-c5745d061ab8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1140.584096] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Waiting for the task: (returnval){ [ 1140.584096] env[65680]: value = "task-2847965" [ 1140.584096] env[65680]: _type = "Task" [ 1140.584096] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1140.591845] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Task: {'id': task-2847965, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1141.027341] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1141.027603] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating directory with path [datastore1] vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1141.027822] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a316126-1558-4e38-9bba-64406f00c328 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.039028] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Created directory with path [datastore1] vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1141.039162] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Fetch image to [datastore1] vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1141.039331] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1141.040023] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe806150-f32a-4f64-aedf-56d85da33ba5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.046158] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50db509f-5ef2-4780-b15d-4b8f07c5a6ec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.054869] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a631c29-dfd7-48ba-9424-f927a0e763c1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.084237] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-230aa4e6-72d9-472e-b087-9981053c377e {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.094460] env[65680]: DEBUG oslo_vmware.api [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Task: {'id': task-2847965, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07474} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1141.095549] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1141.095736] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1141.095905] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1141.096088] env[65680]: INFO nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Took 0.59 seconds to destroy the instance on the hypervisor. 
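Annotation: the records above show the failure pattern that repeats through this run. The image data is written to the vmware_temp directory successfully, but the follow-up CopyVirtualDisk_Task comes back with VimFaultException "A specified parameter was not correct: fileType" (Faults: ['InvalidArgument']), after which the compute manager terminates the instance, unregisters the VM and deletes its datastore directory. The snippet below is a minimal, hypothetical Python sketch of the wait/poll loop visible in the wait_for_task / _poll_task records; get_task_info and TaskFailed are illustrative stand-ins, not the real oslo.vmware API.

    import time

    class TaskFailed(Exception):
        """Stand-in for the translated vCenter fault (e.g. InvalidArgument)."""

    def wait_for_task(task_ref, get_task_info, poll_interval=0.5):
        # Poll the vCenter task until it reaches a terminal state, mirroring
        # the "progress is 0%" and "completed successfully" records above.
        while True:
            info = get_task_info(task_ref)              # hypothetical helper
            if info['state'] == 'running':
                print(f"Task: {task_ref} progress is {info.get('progress', 0)}%")
            elif info['state'] == 'success':
                return info.get('result')
            elif info['state'] == 'error':
                # e.g. "A specified parameter was not correct: fileType"
                raise TaskFailed(info['error'])
            time.sleep(poll_interval)

In the log, task-2847963 (CopyVirtualDisk_Task) reports 0% on the first poll and the next poll surfaces the InvalidArgument fault, which is what produces the "Instance failed to spawn" traceback; task-2847965 (DeleteDatastoreFile_Task) from the cleanup path completes successfully in 0.07s.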
[ 1141.097787] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-25155325-0fae-4c5f-b7f9-180418b0545c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.099580] env[65680]: DEBUG nova.compute.claims [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1141.099752] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1141.099957] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1141.119928] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1141.123723] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1141.124357] env[65680]: DEBUG nova.compute.utils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance dd382edd-abe8-4764-a9d5-4144ef7d50b0 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1141.125772] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Instance disappeared during build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1141.125930] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1141.126102] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1141.126272] env[65680]: DEBUG nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1141.126426] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1141.148948] env[65680]: DEBUG nova.network.neutron [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1141.157931] env[65680]: INFO nova.compute.manager [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] [instance: dd382edd-abe8-4764-a9d5-4144ef7d50b0] Took 0.03 seconds to deallocate network for instance. [ 1141.166069] env[65680]: DEBUG oslo_vmware.rw_handles [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1141.223444] env[65680]: DEBUG oslo_vmware.rw_handles [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Completed reading data from the image iterator. 
{{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1141.223651] env[65680]: DEBUG oslo_vmware.rw_handles [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1141.240662] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2db1deac-9149-49e1-9939-caa67391091c tempest-ServersNegativeTestMultiTenantJSON-124619170 tempest-ServersNegativeTestMultiTenantJSON-124619170-project-member] Lock "dd382edd-abe8-4764-a9d5-4144ef7d50b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 265.706s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1158.432592] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_power_states {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1158.444331] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Getting list of instances from cluster (obj){ [ 1158.444331] env[65680]: value = "domain-c8" [ 1158.444331] env[65680]: _type = "ClusterComputeResource" [ 1158.444331] env[65680]: } {{(pid=65680) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1158.445614] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea240264-eeb4-47a9-beb2-e796f39f3a85 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1158.459406] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Got total of 6 instances {{(pid=65680) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1158.459560] env[65680]: WARNING nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] While synchronizing instance power states, found 2 instances in the database and 6 instances on the hypervisor. 
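Annotation: the warning above comes from the periodic _sync_power_states task; the cluster query returned 6 VMs while only 2 instances (c6953476-8f7a-4314-a88e-cd5d02c3309f and dac6ccec-b1a2-47d5-9750-d5f59f6743ae) exist in this host's database, so a per-UUID sync is triggered under an instance lock. The earlier "Final resource view" records can be checked against the two reported allocations of 128 MB / 1 GB / 1 vCPU each. The snippet below is a back-of-the-envelope check, assuming the default reserved_host_memory_mb of 512 and no host disk reservation (neither value is shown in this log).

    # Allocations reported for the two tracked instances (from the records above).
    allocations = [
        {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1},  # c6953476-8f7a-4314-a88e-cd5d02c3309f
        {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1},  # dac6ccec-b1a2-47d5-9750-d5f59f6743ae
    ]

    RESERVED_MEMORY_MB = 512   # assumed default reserved_host_memory_mb
    RESERVED_DISK_GB = 0       # assumed: no host disk reservation

    used_ram = RESERVED_MEMORY_MB + sum(a['MEMORY_MB'] for a in allocations)
    used_disk = RESERVED_DISK_GB + sum(a['DISK_GB'] for a in allocations)
    used_vcpus = sum(a['VCPU'] for a in allocations)

    # Matches the log: used_ram=768MB used_disk=2GB used_vcpus=2
    print(f"used_ram={used_ram}MB used_disk={used_disk}GB used_vcpus={used_vcpus}")

Note that free_ram in the "Hypervisor/Node resource view" (181069 MB in the later audit) is reported by the driver from vCenter statistics, so it does not have to equal phys_ram minus used_ram.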
[ 1158.459697] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid c6953476-8f7a-4314-a88e-cd5d02c3309f {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1158.459878] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Triggering sync for uuid dac6ccec-b1a2-47d5-9750-d5f59f6743ae {{(pid=65680) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1158.460188] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "c6953476-8f7a-4314-a88e-cd5d02c3309f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1158.460410] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1166.315292] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1166.315703] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1168.294074] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1168.294426] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1168.294426] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1168.305840] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1168.306022] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1168.306132] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. 
{{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1171.294052] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1171.294352] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1174.293610] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.293983] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.293983] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1176.294223] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1176.294620] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1176.304232] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1176.304432] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1176.304592] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1176.304741] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1176.305796] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e598fe43-e163-4b04-b31b-198d7920abe1 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.314264] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fbd83a8-c6db-4a90-a348-3ccc05100f9a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.327425] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71f047c8-8b8f-4f2f-b7ef-f0c195dd0034 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.333261] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb72782-391f-42ae-8898-e3739f0db148 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.362412] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181069MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1176.362541] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1176.362717] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1176.400947] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance c6953476-8f7a-4314-a88e-cd5d02c3309f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1176.401110] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1176.401279] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1176.401415] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1176.434918] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-972283e1-c0a1-43b8-9fd2-6836ec57caeb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.441856] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-050b76c8-ebef-4381-97e8-c3a3d932877c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.470539] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b2a241f-2e50-4d60-a510-ec2d91cba21f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.477288] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed51296b-469f-4b43-b01c-0c16e02f6f22 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.490918] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1176.498651] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1176.510733] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1176.510903] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1186.976074] env[65680]: WARNING oslo_vmware.rw_handles [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 
tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1186.976074] env[65680]: ERROR oslo_vmware.rw_handles [ 1186.976841] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1186.978268] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1186.978504] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Copying Virtual Disk [datastore1] vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/59ff6527-61d7-454e-b277-27667dda5d46/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1186.978780] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2233401a-25b4-432f-9f5a-f8de80434d70 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.987137] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 1186.987137] env[65680]: value = "task-2847966" [ 1186.987137] env[65680]: _type = "Task" [ 1186.987137] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1186.994508] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': task-2847966, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1187.497351] env[65680]: DEBUG oslo_vmware.exceptions [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1187.497588] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1187.498165] env[65680]: ERROR nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1187.498165] env[65680]: Faults: ['InvalidArgument'] [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Traceback (most recent call last): [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] yield resources [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.driver.spawn(context, instance, image_meta, [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._fetch_image_if_missing(context, vi) [ 1187.498165] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] image_cache(vi, tmp_image_ds_loc) [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] vm_util.copy_virtual_disk( [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] session._wait_for_task(vmdk_copy_task) [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.wait_for_task(task_ref) [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return evt.wait() [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] result = hub.switch() [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1187.498597] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.greenlet.switch() [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.f(*self.args, **self.kw) [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise exceptions.translate_fault(task_info.error) [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Faults: ['InvalidArgument'] [ 1187.498955] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1187.498955] env[65680]: INFO nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Terminating instance [ 1187.499981] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1187.500198] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1187.500424] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fab265e0-e015-4ce0-95bf-627de23cc8ee {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.502712] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1187.502908] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1187.503615] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-036933e3-aa49-4c57-a2f7-23dc4eaa72e4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.510016] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1187.510220] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a006bf23-460b-4036-888a-2ffe2940fd9b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.512260] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1187.512426] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1187.513332] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1a97e04-337f-4e50-8842-690080a7d78d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.518431] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Waiting for the task: (returnval){ [ 1187.518431] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52352a32-ecae-8f72-1a13-b8d5e874d2b3" [ 1187.518431] env[65680]: _type = "Task" [ 1187.518431] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1187.524929] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52352a32-ecae-8f72-1a13-b8d5e874d2b3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1187.580055] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1187.580361] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1187.580461] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Deleting the datastore file [datastore1] 8b747838-fcd0-494c-bd5a-0e5b1950a44e {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1187.580714] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1b8d3e58-a53f-4018-b04a-f5fcdf6e0e98 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.587349] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 1187.587349] env[65680]: value = "task-2847968" [ 1187.587349] env[65680]: _type = "Task" [ 1187.587349] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1187.595226] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': task-2847968, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1188.028843] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1188.029248] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Creating directory with path [datastore1] vmware_temp/088c4fd0-9ab1-40fd-be27-3f68f36dc436/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.029335] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6da2848e-df61-4b00-89d4-a67c9dc96bab {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.040090] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Created directory with path [datastore1] vmware_temp/088c4fd0-9ab1-40fd-be27-3f68f36dc436/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.040278] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Fetch image to [datastore1] vmware_temp/088c4fd0-9ab1-40fd-be27-3f68f36dc436/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1188.040441] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/088c4fd0-9ab1-40fd-be27-3f68f36dc436/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1188.041118] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d89e004e-76ac-44f9-8808-ee87cc36c000 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.047368] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a637ab9c-6cca-49e8-8b4a-93215cae1d05 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.055923] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2390b26f-1a96-445b-942d-badf4d05a5bb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.086158] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ce9b3f0-0377-4cde-ada2-280f6681409b {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.096259] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-15906098-80f7-4cfc-b24d-43f6adb97d6a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.097823] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': task-2847968, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066303} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1188.098051] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1188.098225] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1188.098389] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1188.098559] env[65680]: INFO nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Took 0.60 seconds to destroy the instance on the hypervisor. 
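
The CopyVirtualDisk_Task failure recorded above surfaces through oslo.vmware's task polling: _poll_task inspects the task state and, when the task errors out, raises the exception translated from task_info.error (here the InvalidArgument / "fileType" fault), which _wait_for_task then propagates into the compute manager traceback. The snippet below is a minimal, simplified sketch of that poll-and-translate pattern, not the oslo.vmware implementation; `get_task_info` and `TaskFaultError` are hypothetical stand-ins.

```python
# Sketch of a poll-and-translate task wait loop, paralleling the behaviour
# visible in the log: poll until the task leaves its running states, and turn
# a task error into a typed exception carrying the fault list
# (e.g. ['InvalidArgument'] with message "A specified parameter was not
# correct: fileType"). All names here are illustrative, not oslo.vmware APIs.
import time


class TaskFaultError(Exception):
    """Raised when the polled task reports an error state."""

    def __init__(self, message, fault_list):
        super().__init__(message)
        self.fault_list = fault_list  # e.g. ['InvalidArgument']


def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll a task-info callable until success; translate errors to exceptions.

    `get_task_info` is assumed to return an object with .state ('queued',
    'running', 'success', 'error'), .result, and .error (.message, .faults).
    """
    while True:
        info = get_task_info()
        if info.state == "success":
            return info.result
        if info.state == "error":
            # mirrors raise exceptions.translate_fault(task_info.error)
            raise TaskFaultError(info.error.message, info.error.faults)
        time.sleep(poll_interval)  # still queued/running: keep polling
```

In the log, the same pattern is driven by a looping call on an eventlet event rather than a plain sleep loop, but the visible effect is identical: the caller of _wait_for_task sees the translated VimFaultException and aborts the spawn.
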
[ 1188.100869] env[65680]: DEBUG nova.compute.claims [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1188.101044] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1188.101256] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1188.119190] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1188.128107] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1188.128754] env[65680]: DEBUG nova.compute.utils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance 8b747838-fcd0-494c-bd5a-0e5b1950a44e could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1188.130481] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1188.130656] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1188.130815] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1188.130980] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1188.131152] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1188.174555] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1188.175379] env[65680]: ERROR nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] result = getattr(controller, method)(*args, **kwargs) [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._get(image_id) [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1188.175379] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] resp, body = self.http_client.get(url, headers=header) [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.request(url, 'GET', **kwargs) [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._handle_response(resp) [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise exc.from_response(resp, resp.content) [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] During handling of the above exception, another exception occurred: [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.175751] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] yield resources [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.driver.spawn(context, instance, image_meta, [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._fetch_image_if_missing(context, vi) [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] image_fetch(context, vi, tmp_image_ds_loc) [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: 
c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] images.fetch_image( [ 1188.176069] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] metadata = IMAGE_API.get(context, image_ref) [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return session.show(context, image_id, [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] _reraise_translated_image_exception(image_id) [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise new_exc.with_traceback(exc_trace) [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] result = getattr(controller, method)(*args, **kwargs) [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1188.176408] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._get(image_id) [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] resp, body = self.http_client.get(url, headers=header) [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.request(url, 'GET', 
**kwargs) [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._handle_response(resp) [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise exc.from_response(resp, resp.content) [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1188.176730] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.177036] env[65680]: INFO nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Terminating instance [ 1188.177368] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1188.177766] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.178415] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1188.178607] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1188.178830] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-751a4a91-a3b4-4d9c-b9e1-0589a4b592af {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.181253] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3337cd32-9511-4b60-9a4e-b7dd256b7d7b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.188070] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1188.189519] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-85b6f85d-dcba-4196-b157-99d23bb8414c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.191811] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.191977] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1188.192879] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-33faa50d-0b33-47cc-840d-16cf84fc6fc0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.198069] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 1188.198069] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52139aa0-dd34-46e8-d6f5-f0906d116018" [ 1188.198069] env[65680]: _type = "Task" [ 1188.198069] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1188.205017] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52139aa0-dd34-46e8-d6f5-f0906d116018, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1188.233615] env[65680]: DEBUG neutronclient.v2_0.client [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1188.234958] env[65680]: ERROR nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Traceback (most recent call last): [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.driver.spawn(context, instance, image_meta, [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._fetch_image_if_missing(context, vi) [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] image_cache(vi, tmp_image_ds_loc) [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1188.234958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] vm_util.copy_virtual_disk( [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] session._wait_for_task(vmdk_copy_task) [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.wait_for_task(task_ref) [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1188.235323] env[65680]: ERROR 
nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return evt.wait() [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] result = hub.switch() [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.greenlet.switch() [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.f(*self.args, **self.kw) [ 1188.235323] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise exceptions.translate_fault(task_info.error) [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Faults: ['InvalidArgument'] [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] During handling of the above exception, another exception occurred: [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Traceback (most recent call last): [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._build_and_run_instance(context, instance, image, [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] with excutils.save_and_reraise_exception(): [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.force_reraise() [ 1188.235768] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 
8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise self.value [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] with self.rt.instance_claim(context, instance, node, allocs, [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.abort() [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return f(*args, **kwargs) [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._unset_instance_host_and_node(instance) [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1188.236186] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] instance.save() [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] updates, result = self.indirection_api.object_action( [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return cctxt.call(context, 'object_action', objinst=objinst, [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] result = self.transport._send( [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self._driver.send(target, ctxt, message, [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1188.236583] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise result [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] nova.exception_Remote.InstanceNotFound_Remote: Instance 8b747838-fcd0-494c-bd5a-0e5b1950a44e could not be found. [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Traceback (most recent call last): [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return getattr(target, method)(*args, **kwargs) [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return fn(self, *args, **kwargs) [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] old_ref, inst_ref = db.instance_update_and_get_original( [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return f(*args, **kwargs) [ 1188.236928] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] with excutils.save_and_reraise_exception() as ectxt: [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.force_reraise() [ 1188.237364] env[65680]: ERROR 
nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise self.value [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return f(*args, **kwargs) [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return f(context, *args, **kwargs) [ 1188.237364] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise exception.InstanceNotFound(instance_id=uuid) [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] nova.exception.InstanceNotFound: Instance 8b747838-fcd0-494c-bd5a-0e5b1950a44e could not be found. 
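
The nested tracebacks above show oslo_utils.excutils.save_and_reraise_exception at work inside _build_and_run_instance: the original spawn failure (the VimFaultException) is saved while the resource claim is aborted, and because that cleanup itself fails (InstanceNotFound returned over RPC from the conductor), the secondary exception is what continues up the stack and into the network-deallocation failure that follows. A small usage sketch of that helper, assuming only that oslo.utils is installed (as it is in this environment, per the tracebacks); `spawn_instance` and `abort_claim` are hypothetical placeholders, not Nova functions.

```python
# Illustrative use of oslo_utils.excutils.save_and_reraise_exception, the
# context manager that appears throughout the tracebacks above. Used inside
# an except block, it preserves the in-flight exception while cleanup runs
# and re-raises it afterwards; if the cleanup block raises on its own (as the
# claim abort does here), that newer exception propagates instead.
from oslo_utils import excutils


def build_and_run(spawn_instance, abort_claim):
    try:
        spawn_instance()
    except Exception:
        with excutils.save_and_reraise_exception():
            # Cleanup runs with the original exception saved; if abort_claim()
            # succeeds, the original failure (e.g. the VimFaultException) is
            # re-raised once this block exits.
            abort_claim()
```
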
[ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] During handling of the above exception, another exception occurred: [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Traceback (most recent call last): [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] ret = obj(*args, **kwargs) [ 1188.237767] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] exception_handler_v20(status_code, error_body) [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise client_exc(message=error_message, [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Neutron server returns request_ids: ['req-a5679848-a698-4c22-8d37-dbac98b95c8d'] [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] During handling of the above exception, another exception occurred: [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] Traceback (most recent call last): [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._deallocate_network(context, instance, requested_networks) [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1188.238211] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self.network_api.deallocate_for_instance( [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 
8b747838-fcd0-494c-bd5a-0e5b1950a44e] data = neutron.list_ports(**search_opts) [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] ret = obj(*args, **kwargs) [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.list('ports', self.ports_path, retrieve_all, [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] ret = obj(*args, **kwargs) [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] for r in self._pagination(collection, path, **params): [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] res = self.get(path, params=params) [ 1188.238591] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] ret = obj(*args, **kwargs) [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.retry_request("GET", action, body=body, [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] ret = obj(*args, **kwargs) [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] return self.do_request(method, action, body=body, [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] ret = obj(*args, **kwargs) [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, 
in do_request [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] self._handle_fault_response(status_code, replybody, resp) [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1188.238958] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] raise exception.Unauthorized() [ 1188.239368] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] nova.exception.Unauthorized: Not authorized. [ 1188.239368] env[65680]: ERROR nova.compute.manager [instance: 8b747838-fcd0-494c-bd5a-0e5b1950a44e] [ 1188.257127] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "8b747838-fcd0-494c-bd5a-0e5b1950a44e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 313.362s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1188.265134] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1188.265339] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1188.265543] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Deleting the datastore file [datastore1] c9230f1c-72ea-4f62-be9f-949def49c5f4 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1188.265792] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-61c7bbc7-fb0d-4349-8e63-3d3cc58b4847 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.272299] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Waiting for the task: (returnval){ [ 1188.272299] env[65680]: value = "task-2847970" [ 1188.272299] env[65680]: _type = "Task" [ 1188.272299] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1188.280258] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Task: {'id': task-2847970, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1188.708980] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1188.709265] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Creating directory with path [datastore1] vmware_temp/627b700a-54bb-4d1a-944b-3b0afc1651aa/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.709494] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f9513242-1ea7-42db-9371-d3e54a3b1b97 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.721351] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Created directory with path [datastore1] vmware_temp/627b700a-54bb-4d1a-944b-3b0afc1651aa/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.721550] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Fetch image to [datastore1] vmware_temp/627b700a-54bb-4d1a-944b-3b0afc1651aa/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1188.721718] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/627b700a-54bb-4d1a-944b-3b0afc1651aa/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1188.722445] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac61b21-3a49-4a0c-ba0d-9e787e411f5b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.729226] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f769aaaa-06e7-4e48-9b04-8e5f60097b26 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.737966] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12667215-ff7a-40a4-a62f-37ae121ca078 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.769069] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31eba518-4e5a-4b42-9b5e-ff7b73cd899f {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.777262] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-653cc2c4-b969-4464-b6cd-624bf25e09ec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.781861] env[65680]: DEBUG oslo_vmware.api [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Task: {'id': task-2847970, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06345} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1188.782105] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1188.782286] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1188.782455] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1188.782623] env[65680]: INFO nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1188.784861] env[65680]: DEBUG nova.compute.claims [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1188.785049] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1188.785267] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1188.797336] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1188.810106] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1188.810774] env[65680]: DEBUG nova.compute.utils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance c9230f1c-72ea-4f62-be9f-949def49c5f4 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1188.812486] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1188.812660] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1188.812856] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1188.813050] env[65680]: DEBUG nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1188.813217] env[65680]: DEBUG nova.network.neutron [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1188.835920] env[65680]: DEBUG neutronclient.v2_0.client [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1188.837484] env[65680]: ERROR nova.compute.manager [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] result = getattr(controller, method)(*args, **kwargs) [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._get(image_id) [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1188.837484] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] resp, body = self.http_client.get(url, headers=header) [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.request(url, 'GET', **kwargs) [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._handle_response(resp) [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise exc.from_response(resp, resp.content) [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] During handling of the above exception, another exception occurred: [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.837815] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.driver.spawn(context, instance, image_meta, [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._fetch_image_if_missing(context, vi) [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] image_fetch(context, vi, tmp_image_ds_loc) [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] images.fetch_image( [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: 
c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] metadata = IMAGE_API.get(context, image_ref) [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1188.838133] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return session.show(context, image_id, [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] _reraise_translated_image_exception(image_id) [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise new_exc.with_traceback(exc_trace) [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] result = getattr(controller, method)(*args, **kwargs) [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._get(image_id) [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1188.838485] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] resp, body = self.http_client.get(url, headers=header) [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.request(url, 'GET', **kwargs) [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] 
return self._handle_response(resp) [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise exc.from_response(resp, resp.content) [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] During handling of the above exception, another exception occurred: [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.838807] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._build_and_run_instance(context, instance, image, [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] with excutils.save_and_reraise_exception(): [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.force_reraise() [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise self.value [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] with self.rt.instance_claim(context, instance, node, allocs, [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.abort() [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1188.839117] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return f(*args, **kwargs) [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._unset_instance_host_and_node(instance) [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] instance.save() [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] updates, result = self.indirection_api.object_action( [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return cctxt.call(context, 'object_action', objinst=objinst, [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1188.839488] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] result = self.transport._send( [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._driver.send(target, ctxt, message, [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise result [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] nova.exception_Remote.InstanceNotFound_Remote: Instance c9230f1c-72ea-4f62-be9f-949def49c5f4 could not be found. 
[ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return getattr(target, method)(*args, **kwargs) [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.839848] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return fn(self, *args, **kwargs) [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] old_ref, inst_ref = db.instance_update_and_get_original( [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return f(*args, **kwargs) [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] with excutils.save_and_reraise_exception() as ectxt: [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.force_reraise() [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840250] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise self.value [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return f(*args, 
**kwargs) [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return f(context, *args, **kwargs) [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise exception.InstanceNotFound(instance_id=uuid) [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.840681] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] nova.exception.InstanceNotFound: Instance c9230f1c-72ea-4f62-be9f-949def49c5f4 could not be found. [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] During handling of the above exception, another exception occurred: [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] ret = obj(*args, **kwargs) [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] exception_handler_v20(status_code, error_body) [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise client_exc(message=error_message, [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1188.841186] 
env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Neutron server returns request_ids: ['req-c295aff1-d9a7-4815-afde-3f0041c16532'] [ 1188.841186] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] During handling of the above exception, another exception occurred: [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] Traceback (most recent call last): [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._deallocate_network(context, instance, requested_networks) [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self.network_api.deallocate_for_instance( [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] data = neutron.list_ports(**search_opts) [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] ret = obj(*args, **kwargs) [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1188.841601] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.list('ports', self.ports_path, retrieve_all, [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] ret = obj(*args, **kwargs) [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] for r in self._pagination(collection, path, **params): [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] res = self.get(path, params=params) [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] ret = obj(*args, **kwargs) [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.retry_request("GET", action, body=body, [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] ret = obj(*args, **kwargs) [ 1188.841993] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] return self.do_request(method, action, body=body, [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] ret = obj(*args, **kwargs) [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] self._handle_fault_response(status_code, replybody, resp) [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] raise exception.Unauthorized() [ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] nova.exception.Unauthorized: Not authorized. 
[ 1188.842438] env[65680]: ERROR nova.compute.manager [instance: c9230f1c-72ea-4f62-be9f-949def49c5f4] [ 1188.861288] env[65680]: DEBUG oslo_concurrency.lockutils [None req-2d60280d-829b-4995-8ae7-a556f8a1d6ea tempest-ServerRescueTestJSON-1885332325 tempest-ServerRescueTestJSON-1885332325-project-member] Lock "c9230f1c-72ea-4f62-be9f-949def49c5f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 309.577s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1188.890287] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1188.891105] env[65680]: ERROR nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] result = getattr(controller, method)(*args, **kwargs) [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._get(image_id) [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1188.891105] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] resp, body = self.http_client.get(url, headers=header) [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.request(url, 'GET', **kwargs) [ 
1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._handle_response(resp) [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise exc.from_response(resp, resp.content) [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] During handling of the above exception, another exception occurred: [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1188.891413] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] yield resources [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.driver.spawn(context, instance, image_meta, [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._fetch_image_if_missing(context, vi) [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] image_fetch(context, vi, tmp_image_ds_loc) [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] 
images.fetch_image( [ 1188.891727] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] metadata = IMAGE_API.get(context, image_ref) [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return session.show(context, image_id, [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] _reraise_translated_image_exception(image_id) [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise new_exc.with_traceback(exc_trace) [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] result = getattr(controller, method)(*args, **kwargs) [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1188.892071] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._get(image_id) [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] resp, body = self.http_client.get(url, headers=header) [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.request(url, 'GET', **kwargs) [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1188.892393] 
env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._handle_response(resp) [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise exc.from_response(resp, resp.content) [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1188.892393] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1188.892677] env[65680]: INFO nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Terminating instance [ 1188.892865] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1188.893084] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.893333] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aed85bd3-3d71-4894-bea2-80fb57f55c1a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.896922] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1188.897142] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1188.897944] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d171a51a-ba27-466e-9320-d4e02e370b8b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.901463] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.901636] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1188.902597] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4eb24d5b-686a-4bdf-85f2-7e6c2d648943 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.906645] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1188.907116] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3c01b9fd-d584-4e16-b9f6-58584ec8e5c7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.909433] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Waiting for the task: (returnval){ [ 1188.909433] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52a4d3f0-628c-1b70-3b07-072c1843bdac" [ 1188.909433] env[65680]: _type = "Task" [ 1188.909433] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1188.916867] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52a4d3f0-628c-1b70-3b07-072c1843bdac, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1188.969342] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1188.969582] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1188.969693] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Deleting the datastore file [datastore1] 01e82211-1de5-44ad-b14e-81a54470d4e5 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1188.969926] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-23a7f253-57cf-4e69-9b9c-50723958aa1f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.976322] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Waiting for the task: (returnval){ [ 1188.976322] env[65680]: value = "task-2847972" [ 1188.976322] env[65680]: _type = "Task" [ 1188.976322] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1188.983770] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': task-2847972, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1189.419337] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1189.419691] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Creating directory with path [datastore1] vmware_temp/ecc9d086-cf50-40d1-822b-02ff0bc9dbd1/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1189.419740] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6500e8e7-588f-47bf-81cf-1c0a554291ed {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.430288] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Created directory with path [datastore1] vmware_temp/ecc9d086-cf50-40d1-822b-02ff0bc9dbd1/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1189.430471] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Fetch image to [datastore1] vmware_temp/ecc9d086-cf50-40d1-822b-02ff0bc9dbd1/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1189.430637] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/ecc9d086-cf50-40d1-822b-02ff0bc9dbd1/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1189.431326] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c15be06-452d-486e-a960-e448e7c76079 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.437792] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8995068b-512c-4cdb-8090-77ceb42ece80 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.447465] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a235f35e-3f68-427b-b9c2-a08b052868c6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.476704] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a82fa3a1-a241-490e-94b9-7a247dc6ca4e {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.487099] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-21df5cef-611c-47c9-8789-e826033feafb {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.488723] env[65680]: DEBUG oslo_vmware.api [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Task: {'id': task-2847972, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0762} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1189.488950] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1189.489141] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1189.489310] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1189.489479] env[65680]: INFO nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1189.491533] env[65680]: DEBUG nova.compute.claims [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1189.491704] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1189.491916] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1189.515939] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1189.518773] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1189.519398] env[65680]: DEBUG nova.compute.utils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance 01e82211-1de5-44ad-b14e-81a54470d4e5 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1189.520753] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Instance disappeared during build. {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1189.520925] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1189.521100] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1189.521268] env[65680]: DEBUG nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1189.521423] env[65680]: DEBUG nova.network.neutron [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1189.545129] env[65680]: DEBUG neutronclient.v2_0.client [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1189.546576] env[65680]: ERROR nova.compute.manager [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] result = getattr(controller, method)(*args, **kwargs) [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._get(image_id) [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1189.546576] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] resp, body = self.http_client.get(url, headers=header) [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.request(url, 'GET', **kwargs) [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._handle_response(resp) [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise exc.from_response(resp, resp.content) [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] During handling of the above exception, another exception occurred: [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1189.547131] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.driver.spawn(context, instance, image_meta, [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._fetch_image_if_missing(context, vi) [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] image_fetch(context, vi, tmp_image_ds_loc) [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] images.fetch_image( [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 
01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] metadata = IMAGE_API.get(context, image_ref) [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1189.547615] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return session.show(context, image_id, [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] _reraise_translated_image_exception(image_id) [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise new_exc.with_traceback(exc_trace) [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] result = getattr(controller, method)(*args, **kwargs) [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._get(image_id) [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1189.548043] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] resp, body = self.http_client.get(url, headers=header) [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.request(url, 'GET', **kwargs) [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] 
return self._handle_response(resp) [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise exc.from_response(resp, resp.content) [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] During handling of the above exception, another exception occurred: [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1189.548482] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._build_and_run_instance(context, instance, image, [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] with excutils.save_and_reraise_exception(): [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.force_reraise() [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise self.value [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] with self.rt.instance_claim(context, instance, node, allocs, [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.abort() [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1189.548841] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return f(*args, **kwargs) [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._unset_instance_host_and_node(instance) [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] instance.save() [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] updates, result = self.indirection_api.object_action( [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return cctxt.call(context, 'object_action', objinst=objinst, [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1189.549236] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] result = self.transport._send( [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._driver.send(target, ctxt, message, [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise result [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] nova.exception_Remote.InstanceNotFound_Remote: Instance 01e82211-1de5-44ad-b14e-81a54470d4e5 could not be found. 
[ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return getattr(target, method)(*args, **kwargs) [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549612] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return fn(self, *args, **kwargs) [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] old_ref, inst_ref = db.instance_update_and_get_original( [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return f(*args, **kwargs) [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] with excutils.save_and_reraise_exception() as ectxt: [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.force_reraise() [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.549984] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise self.value [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return f(*args, 
**kwargs) [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return f(context, *args, **kwargs) [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise exception.InstanceNotFound(instance_id=uuid) [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.550816] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] nova.exception.InstanceNotFound: Instance 01e82211-1de5-44ad-b14e-81a54470d4e5 could not be found. [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] During handling of the above exception, another exception occurred: [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] ret = obj(*args, **kwargs) [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] exception_handler_v20(status_code, error_body) [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise client_exc(message=error_message, [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1189.551459] 
env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Neutron server returns request_ids: ['req-9cd3ae21-46c5-4be9-8e74-081ed54ea85e'] [ 1189.551459] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] During handling of the above exception, another exception occurred: [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] Traceback (most recent call last): [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._deallocate_network(context, instance, requested_networks) [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self.network_api.deallocate_for_instance( [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] data = neutron.list_ports(**search_opts) [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] ret = obj(*args, **kwargs) [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1189.552016] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.list('ports', self.ports_path, retrieve_all, [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] ret = obj(*args, **kwargs) [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] for r in self._pagination(collection, path, **params): [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] res = self.get(path, params=params) [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] ret = obj(*args, **kwargs) [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.retry_request("GET", action, body=body, [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] ret = obj(*args, **kwargs) [ 1189.552543] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] return self.do_request(method, action, body=body, [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] ret = obj(*args, **kwargs) [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] self._handle_fault_response(status_code, replybody, resp) [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] raise exception.Unauthorized() [ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] nova.exception.Unauthorized: Not authorized. 
[ 1189.552999] env[65680]: ERROR nova.compute.manager [instance: 01e82211-1de5-44ad-b14e-81a54470d4e5] [ 1189.568814] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1c5ff9cf-a7b2-45c6-9866-a6dc4c6727f3 tempest-MultipleCreateTestJSON-1277418426 tempest-MultipleCreateTestJSON-1277418426-project-member] Lock "01e82211-1de5-44ad-b14e-81a54470d4e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 314.647s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1189.604935] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1189.605716] env[65680]: ERROR nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] result = getattr(controller, method)(*args, **kwargs) [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._get(image_id) [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1189.605716] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] resp, body = self.http_client.get(url, headers=header) [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.request(url, 'GET', **kwargs) [ 
1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._handle_response(resp) [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise exc.from_response(resp, resp.content) [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] During handling of the above exception, another exception occurred: [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1189.606159] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] yield resources [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.driver.spawn(context, instance, image_meta, [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._fetch_image_if_missing(context, vi) [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] image_fetch(context, vi, tmp_image_ds_loc) [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] 
images.fetch_image( [ 1189.606546] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] metadata = IMAGE_API.get(context, image_ref) [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return session.show(context, image_id, [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] _reraise_translated_image_exception(image_id) [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise new_exc.with_traceback(exc_trace) [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] result = getattr(controller, method)(*args, **kwargs) [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1189.606936] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._get(image_id) [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] resp, body = self.http_client.get(url, headers=header) [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.request(url, 'GET', **kwargs) [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1189.607317] 
env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._handle_response(resp) [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise exc.from_response(resp, resp.content) [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. [ 1189.607317] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1189.607650] env[65680]: INFO nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Terminating instance [ 1189.607650] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1189.607650] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1189.608208] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1189.608393] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1189.608616] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-24865c26-c172-4d2d-a09a-69b56c369424 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.611396] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cde6bc9-b5e4-4bf4-b8c6-f676cbf0d9b8 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.617906] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1189.618129] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6c32fe79-22d6-4b9f-8377-2c919e5fa874 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.620253] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1189.620423] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1189.621342] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4bb135f5-e6e8-4fb2-81c6-4a44fd0e0651 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.626691] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Waiting for the task: (returnval){ [ 1189.626691] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52cc6c96-e6dc-cfc0-6259-d8ee0a204f35" [ 1189.626691] env[65680]: _type = "Task" [ 1189.626691] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1189.633408] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52cc6c96-e6dc-cfc0-6259-d8ee0a204f35, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1189.676069] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1189.676235] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1189.676412] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Deleting the datastore file [datastore1] 132e6039-55dc-4118-bcd5-d32557743981 {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1189.676695] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ebd2ac7-718f-4745-86ca-60608207a707 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.682550] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Waiting for the task: (returnval){ [ 1189.682550] env[65680]: value = "task-2847974" [ 1189.682550] env[65680]: _type = "Task" [ 1189.682550] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1189.689810] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Task: {'id': task-2847974, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1190.136603] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1190.137353] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Creating directory with path [datastore1] vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1190.137353] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbe17095-eeca-4f1a-9216-626477c91633 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1190.148621] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Created directory with path [datastore1] vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1190.148820] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Fetch image to [datastore1] vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1190.148985] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1190.149710] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccfdfb17-a587-4fec-ac1d-f4e79c31ba6e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1190.157587] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3289256e-663b-451d-9e4a-c12a1052d90f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1190.166251] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3fdfb85-5b90-4e85-8fca-fe61821084d7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1190.199046] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f95834e1-1e84-4fd6-9e62-712fd0e7f284 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1190.205836] env[65680]: DEBUG oslo_vmware.api [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Task: {'id': task-2847974, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072292} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1190.207313] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1190.207497] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1190.207667] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1190.207844] env[65680]: INFO nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Took 0.60 seconds to destroy the instance on the hypervisor. 
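The DeleteDatastoreFile_Task entries above follow the generic oslo.vmware wait_for_task pattern: the caller submits a vCenter task, then repeatedly polls its state (logging "progress is 0%" along the way) until the task reports success or an error. The sketch below is only a minimal illustration of that polling loop under assumed names, not the oslo.vmware implementation; `get_task_info` and `TaskFailed` are hypothetical stand-ins for the real VIM property reads and fault translation.

```python
import time

POLL_INTERVAL = 0.5  # seconds between polls, analogous to task_poll_interval


class TaskFailed(Exception):
    """Hypothetical stand-in for the fault raised when a vCenter task errors."""


def wait_for_task(task_ref, get_task_info):
    """Poll a vCenter task reference until it finishes.

    `get_task_info` is an assumed callable returning an object with
    `state` ('queued', 'running', 'success', 'error'), `progress`, and
    `error` attributes, roughly like the TaskInfo the log entries poll.
    """
    while True:
        info = get_task_info(task_ref)
        if info.state == 'success':
            return info
        if info.state == 'error':
            raise TaskFailed(info.error)
        # Still 'queued' or 'running': report progress and retry, as the
        # "Task: {...} progress is 0%" DEBUG lines above do.
        time.sleep(POLL_INTERVAL)
```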
[ 1190.209554] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aee4c549-60f5-4aa1-bf38-ef0553dcba12 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1190.211327] env[65680]: DEBUG nova.compute.claims [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1190.211496] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1190.211703] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1190.232705] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1190.237773] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1190.238415] env[65680]: DEBUG nova.compute.utils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance 132e6039-55dc-4118-bcd5-d32557743981 could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1190.239706] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Instance disappeared during build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1190.239874] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1190.240043] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1190.240211] env[65680]: DEBUG nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1190.240364] env[65680]: DEBUG nova.network.neutron [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1190.320640] env[65680]: DEBUG oslo_vmware.rw_handles [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1190.374176] env[65680]: DEBUG neutronclient.v2_0.client [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65680) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1190.376450] env[65680]: ERROR nova.compute.manager [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] [instance: 132e6039-55dc-4118-bcd5-d32557743981] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] result = getattr(controller, method)(*args, **kwargs) [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._get(image_id) [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1190.376450] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] resp, body = self.http_client.get(url, headers=header) [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.request(url, 'GET', **kwargs) [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._handle_response(resp) [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise exc.from_response(resp, resp.content) [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] During handling of the above exception, another exception occurred: [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1190.376855] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.driver.spawn(context, instance, image_meta, [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._fetch_image_if_missing(context, vi) [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] image_fetch(context, vi, tmp_image_ds_loc) [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] images.fetch_image( [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] metadata = IMAGE_API.get(context, image_ref) [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1190.377217] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return session.show(context, image_id, [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] _reraise_translated_image_exception(image_id) [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise new_exc.with_traceback(exc_trace) [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 
132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] result = getattr(controller, method)(*args, **kwargs) [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._get(image_id) [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1190.377569] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] resp, body = self.http_client.get(url, headers=header) [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.request(url, 'GET', **kwargs) [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._handle_response(resp) [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise exc.from_response(resp, resp.content) [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] nova.exception.ImageNotAuthorized: Not authorized for image 43113302-7f85-4bd9-95eb-c8e71f92d770. 
[ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] During handling of the above exception, another exception occurred: [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1190.377941] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._build_and_run_instance(context, instance, image, [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] with excutils.save_and_reraise_exception(): [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.force_reraise() [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise self.value [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] with self.rt.instance_claim(context, instance, node, allocs, [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.abort() [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1190.378311] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return f(*args, **kwargs) [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._unset_instance_host_and_node(instance) [ 1190.378710] env[65680]: ERROR nova.compute.manager 
[instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] instance.save() [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] updates, result = self.indirection_api.object_action( [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return cctxt.call(context, 'object_action', objinst=objinst, [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1190.378710] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] result = self.transport._send( [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._driver.send(target, ctxt, message, [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise result [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] nova.exception_Remote.InstanceNotFound_Remote: Instance 132e6039-55dc-4118-bcd5-d32557743981 could not be found. 
[ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return getattr(target, method)(*args, **kwargs) [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379088] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return fn(self, *args, **kwargs) [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] old_ref, inst_ref = db.instance_update_and_get_original( [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return f(*args, **kwargs) [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] with excutils.save_and_reraise_exception() as ectxt: [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.force_reraise() [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379522] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise self.value [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return f(*args, 
**kwargs) [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return f(context, *args, **kwargs) [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise exception.InstanceNotFound(instance_id=uuid) [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.379932] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] nova.exception.InstanceNotFound: Instance 132e6039-55dc-4118-bcd5-d32557743981 could not be found. [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] During handling of the above exception, another exception occurred: [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] ret = obj(*args, **kwargs) [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] exception_handler_v20(status_code, error_body) [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise client_exc(message=error_message, [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1190.380367] 
env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Neutron server returns request_ids: ['req-0ad6e394-5f28-49c1-bfa3-40f886d2f4bd'] [ 1190.380367] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] During handling of the above exception, another exception occurred: [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] Traceback (most recent call last): [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._deallocate_network(context, instance, requested_networks) [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self.network_api.deallocate_for_instance( [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] data = neutron.list_ports(**search_opts) [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] ret = obj(*args, **kwargs) [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1190.380772] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.list('ports', self.ports_path, retrieve_all, [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] ret = obj(*args, **kwargs) [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] for r in self._pagination(collection, path, **params): [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] res = self.get(path, params=params) [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] ret = obj(*args, **kwargs) [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.retry_request("GET", action, body=body, [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] ret = obj(*args, **kwargs) [ 1190.381162] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] return self.do_request(method, action, body=body, [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] ret = obj(*args, **kwargs) [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] self._handle_fault_response(status_code, replybody, resp) [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] raise exception.Unauthorized() [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] nova.exception.Unauthorized: Not authorized. [ 1190.381530] env[65680]: ERROR nova.compute.manager [instance: 132e6039-55dc-4118-bcd5-d32557743981] [ 1190.381530] env[65680]: DEBUG oslo_vmware.rw_handles [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1190.381809] env[65680]: DEBUG oslo_vmware.rw_handles [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1190.397539] env[65680]: DEBUG oslo_concurrency.lockutils [None req-1fc700d7-962a-49c5-989d-2f7241290fd0 tempest-ServersTestFqdnHostnames-550917673 tempest-ServersTestFqdnHostnames-550917673-project-member] Lock "132e6039-55dc-4118-bcd5-d32557743981" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 310.363s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1200.183077] env[65680]: DEBUG nova.compute.manager [req-38d2f290-b5ba-467d-b788-516c96be9be8 req-853cd282-3fba-4d83-ae39-905ab36e2c3b service nova] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Received event network-vif-deleted-e890ed3f-45ac-4f3e-9611-61b45d4951d2 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1226.507664] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1226.507964] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1230.294264] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1230.294552] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1230.294590] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1230.304892] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1230.305057] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1233.292724] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1233.293072] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1234.293506] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.293256] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.293497] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1236.293640] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1236.303722] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1236.303956] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1236.304148] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1236.304309] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1236.305430] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-515e23db-e221-4eb8-ab7e-cd01b1054432 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1236.314123] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3a54513-d9f3-4921-8623-93aeda39f779 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1236.327536] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eb59f1d-71b0-473d-aafe-b47a2befd916 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1236.333405] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939f9d29-59ca-4cbb-885b-2fd5c6f4106d {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1236.362066] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181084MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1236.362066] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1236.362066] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1236.400537] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1236.400730] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1236.400871] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1236.424898] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8307dd2-cf1a-410f-9a45-c8ba02a50e0d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1236.431981] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95fdc3c6-00f6-4a7c-a254-7d0a01070c6e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.082501] env[65680]: WARNING oslo_vmware.rw_handles [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles 
response.begin() [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1237.082501] env[65680]: ERROR oslo_vmware.rw_handles [ 1237.083221] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1237.084687] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1237.084941] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Copying Virtual Disk [datastore1] vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/07877566-f667-4ba5-97b2-dad2ccb0b94e/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1237.085690] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00616a70-8070-447d-8e59-793dcefeca1a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.088290] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0bda37e5-dc51-45ee-b5c4-e01383f58610 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.095189] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0a9b118-3e79-4cce-a23e-c85eed52cd4f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.099897] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Waiting for the task: (returnval){ [ 1237.099897] env[65680]: value = "task-2847975" [ 1237.099897] env[65680]: _type = "Task" [ 1237.099897] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1237.110182] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1237.116047] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Task: {'id': task-2847975, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1237.118908] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1237.131809] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1237.131984] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.770s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1237.611495] env[65680]: DEBUG oslo_vmware.exceptions [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1237.611931] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1237.612306] env[65680]: ERROR nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1237.612306] env[65680]: Faults: ['InvalidArgument'] [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Traceback (most recent call last): [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] yield resources [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] self.driver.spawn(context, instance, image_meta, [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] self._fetch_image_if_missing(context, vi) [ 1237.612306] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] image_cache(vi, tmp_image_ds_loc) [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] vm_util.copy_virtual_disk( [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] session._wait_for_task(vmdk_copy_task) [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] return self.wait_for_task(task_ref) [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] return evt.wait() [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] result = hub.switch() [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1237.613020] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] return self.greenlet.switch() [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] self.f(*self.args, **self.kw) [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] raise exceptions.translate_fault(task_info.error) [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Faults: ['InvalidArgument'] [ 1237.613865] env[65680]: ERROR nova.compute.manager [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] [ 1237.613865] env[65680]: INFO nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Terminating instance [ 1237.615099] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1237.615326] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1237.615934] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 
tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Start destroying the instance on the hypervisor. {{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1237.616157] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1237.616382] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1d59c29a-45ce-49e7-bb05-0a5f6a8661d0 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.618528] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5c483f6-5bf9-497b-ae20-3eaeca876dec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.624916] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1237.625138] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-05640653-0e16-4ed7-8fc9-6e669a707916 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.627282] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1237.627455] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1237.628369] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ba1851d0-9da3-4606-b3df-f7a45f94275a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.633520] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1237.633520] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52cf5a30-7041-8d57-31ff-00c74edd188f" [ 1237.633520] env[65680]: _type = "Task" [ 1237.633520] env[65680]: } to complete. 
{{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1237.641072] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]52cf5a30-7041-8d57-31ff-00c74edd188f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1237.698073] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1237.698316] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1237.698490] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Deleting the datastore file [datastore1] c6953476-8f7a-4314-a88e-cd5d02c3309f {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1237.698737] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4d046d93-b93e-4aa6-934a-4b80097620bc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.704596] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Waiting for the task: (returnval){ [ 1237.704596] env[65680]: value = "task-2847977" [ 1237.704596] env[65680]: _type = "Task" [ 1237.704596] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1237.711821] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Task: {'id': task-2847977, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1238.144170] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1238.144446] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating directory with path [datastore1] vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1238.144673] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d6dbb131-4d47-4aa1-a97f-664668070414 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1238.156708] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created directory with path [datastore1] vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1238.156885] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Fetch image to [datastore1] vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1238.157063] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1238.157742] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8684de14-0b23-4ee7-b64a-f939298b7aa4 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1238.164155] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37441e6f-2310-49df-9b10-c73e0f42232d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1238.172970] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa85fd4f-d17c-4e51-bf23-4e376c4884a2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1238.202402] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b59f31e-a061-4b89-84dd-4caa6071369c {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1238.209463] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4e925739-ee98-4c71-8891-8682a3c9356d {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1238.213584] env[65680]: DEBUG oslo_vmware.api [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Task: {'id': task-2847977, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077257} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1238.214130] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1238.214319] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1238.214481] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1238.214648] env[65680]: INFO nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1238.216734] env[65680]: DEBUG nova.compute.claims [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1238.216900] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1238.217131] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1238.231968] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1238.246674] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1238.247360] env[65680]: DEBUG nova.compute.utils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance c6953476-8f7a-4314-a88e-cd5d02c3309f could not be found. {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1238.248790] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Instance disappeared during build. 
{{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1238.248956] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1238.249129] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1238.249336] env[65680]: DEBUG nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1238.249502] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1238.277043] env[65680]: DEBUG oslo_vmware.rw_handles [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1238.279071] env[65680]: DEBUG nova.network.neutron [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1238.331375] env[65680]: INFO nova.compute.manager [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] Took 0.08 seconds to deallocate network for instance. [ 1238.335824] env[65680]: DEBUG oslo_vmware.rw_handles [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Completed reading data from the image iterator. 
{{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1238.335978] env[65680]: DEBUG oslo_vmware.rw_handles [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1238.374218] env[65680]: DEBUG oslo_concurrency.lockutils [None req-07389c0c-5497-446a-b99d-4e22ff2c77ac tempest-ServersV294TestFqdnHostnames-1436406955 tempest-ServersV294TestFqdnHostnames-1436406955-project-member] Lock "c6953476-8f7a-4314-a88e-cd5d02c3309f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.577s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1238.374447] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "c6953476-8f7a-4314-a88e-cd5d02c3309f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 79.914s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1238.374630] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: c6953476-8f7a-4314-a88e-cd5d02c3309f] During sync_power_state the instance has a pending task (spawning). Skip. [ 1238.374798] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "c6953476-8f7a-4314-a88e-cd5d02c3309f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1239.131871] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1239.287741] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1241.145185] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1285.034664] env[65680]: WARNING oslo_vmware.rw_handles [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 
1285.034664] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1285.034664] env[65680]: ERROR oslo_vmware.rw_handles [ 1285.035655] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1285.036872] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1285.037130] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Copying Virtual Disk [datastore1] vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/ca4784be-4e78-4522-bf09-e324552c7ad2/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1285.037434] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2720b9c9-7c7a-45c3-b5c5-7d8409d0f98e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1285.045538] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1285.045538] env[65680]: value = "task-2847978" [ 1285.045538] env[65680]: _type = "Task" [ 1285.045538] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1285.052857] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847978, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1285.555779] env[65680]: DEBUG oslo_vmware.exceptions [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Fault InvalidArgument not matched. {{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1285.556033] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1285.556607] env[65680]: ERROR nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1285.556607] env[65680]: Faults: ['InvalidArgument'] [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Traceback (most recent call last): [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] yield resources [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self.driver.spawn(context, instance, image_meta, [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self._fetch_image_if_missing(context, vi) [ 1285.556607] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] image_cache(vi, tmp_image_ds_loc) [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] vm_util.copy_virtual_disk( [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 
1423, in copy_virtual_disk [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] session._wait_for_task(vmdk_copy_task) [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] return self.wait_for_task(task_ref) [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] return evt.wait() [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] result = hub.switch() [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1285.556986] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] return self.greenlet.switch() [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self.f(*self.args, **self.kw) [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] raise exceptions.translate_fault(task_info.error) [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Faults: ['InvalidArgument'] [ 1285.557498] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] [ 1285.557498] env[65680]: INFO nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Terminating instance [ 1285.559654] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1285.559843] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1285.560560] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dde45121-dcca-4f59-865e-f6e9ecf92531 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1285.566719] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1285.566920] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c220696c-35f0-48e3-9bb1-87de923baec2 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1285.630467] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1285.630650] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1285.630790] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleting the datastore file [datastore1] dac6ccec-b1a2-47d5-9750-d5f59f6743ae {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1285.631042] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eb4a1ef4-cdb5-4924-b9b4-64e8d43bb0a5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1285.637167] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1285.637167] env[65680]: value = "task-2847980" [ 1285.637167] env[65680]: _type = "Task" [ 1285.637167] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1285.644037] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847980, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1286.147436] env[65680]: DEBUG oslo_vmware.api [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847980, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.059703} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1286.147822] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1286.147865] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1286.148044] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1286.148221] env[65680]: INFO nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1286.150302] env[65680]: DEBUG nova.compute.claims [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1286.150471] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1286.150687] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1286.215036] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d511399-3c7c-4787-948d-4a483976583a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1286.221452] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94a46feb-369d-4830-98fb-55077dea34b7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1286.251135] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89f1d084-8b7a-4d43-bd37-11604b08ea8c {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1286.257636] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf5ba8b-301a-4d2c-8779-1b6b61779851 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1286.269893] env[65680]: DEBUG nova.compute.provider_tree [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1286.277920] env[65680]: DEBUG nova.scheduler.client.report [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1286.290229] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 
tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.139s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1286.290719] env[65680]: ERROR nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1286.290719] env[65680]: Faults: ['InvalidArgument'] [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Traceback (most recent call last): [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self.driver.spawn(context, instance, image_meta, [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self._fetch_image_if_missing(context, vi) [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] image_cache(vi, tmp_image_ds_loc) [ 1286.290719] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] vm_util.copy_virtual_disk( [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] session._wait_for_task(vmdk_copy_task) [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] return self.wait_for_task(task_ref) [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] return evt.wait() [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: 
dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] result = hub.switch() [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] return self.greenlet.switch() [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1286.291123] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] self.f(*self.args, **self.kw) [ 1286.291493] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1286.291493] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] raise exceptions.translate_fault(task_info.error) [ 1286.291493] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1286.291493] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Faults: ['InvalidArgument'] [ 1286.291493] env[65680]: ERROR nova.compute.manager [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] [ 1286.291493] env[65680]: DEBUG nova.compute.utils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1286.292572] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1286.292991] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Build of instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae was re-scheduled: A specified parameter was not correct: fileType [ 1286.292991] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1286.293374] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1286.293539] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1286.293704] env[65680]: DEBUG nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1286.293857] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1286.295414] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1286.554354] env[65680]: DEBUG nova.network.neutron [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.566130] env[65680]: INFO nova.compute.manager [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Took 0.27 seconds to deallocate network for instance. [ 1286.660826] env[65680]: INFO nova.scheduler.client.report [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleted allocations for instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae [ 1286.676941] env[65680]: DEBUG oslo_concurrency.lockutils [None req-e220d3b2-fd6f-4c48-9e91-27fffb4da80e tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.514s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1286.677200] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 128.217s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1286.677388] env[65680]: INFO nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1286.677584] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1286.677822] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 45.533s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1286.678040] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1286.678244] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1286.678405] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1286.680306] env[65680]: INFO nova.compute.manager [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Terminating instance [ 1286.681956] env[65680]: DEBUG nova.compute.manager [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1286.682164] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1286.682602] env[65680]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2b97491f-a2de-43d3-873b-ac5eb580f5ac {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1286.691468] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2828c3ee-888a-46ef-8763-c262f7805dfc {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1286.713533] env[65680]: WARNING nova.virt.vmwareapi.vmops [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dac6ccec-b1a2-47d5-9750-d5f59f6743ae could not be found. [ 1286.713749] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1286.713924] env[65680]: INFO nova.compute.manager [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Took 0.03 seconds to destroy the instance on the hypervisor. [ 1286.714176] env[65680]: DEBUG oslo.service.loopingcall [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1286.714377] env[65680]: DEBUG nova.compute.manager [-] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1286.714487] env[65680]: DEBUG nova.network.neutron [-] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1286.738028] env[65680]: DEBUG nova.network.neutron [-] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.745906] env[65680]: INFO nova.compute.manager [-] [instance: dac6ccec-b1a2-47d5-9750-d5f59f6743ae] Took 0.03 seconds to deallocate network for instance. 
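The terminate sequence just above is tolerant of a VM that no longer exists on the backend: vmops.destroy() gets InstanceNotFound, logs a WARNING, treats the instance as destroyed, and network deallocation still runs (0.03s). The sketch below illustrates that "missing VM is not fatal during teardown" shape in plain Python; find_vm_ref, delete_vm and deallocate_network are hypothetical placeholders, not the actual Nova driver methods.

    # Illustrative sketch only: teardown that tolerates a VM already gone from
    # the backend, mirroring the WARNING + "Instance destroyed" pair in the log.
    import logging

    LOG = logging.getLogger(__name__)


    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""


    def find_vm_ref(uuid):
        # Pretend the backend no longer knows this VM, as in the log above.
        raise InstanceNotFound(f"Instance {uuid} could not be found.")


    def delete_vm(vm_ref):
        LOG.info("Destroying VM %s on the hypervisor", vm_ref)


    def deallocate_network(uuid):
        LOG.info("Deallocating network for instance %s", uuid)


    def terminate_instance(uuid):
        """Destroy the backend VM if present, then always clean up networking."""
        try:
            delete_vm(find_vm_ref(uuid))
        except InstanceNotFound:
            # A missing VM is logged and skipped; cleanup continues regardless.
            LOG.warning("Instance %s does not exist on backend", uuid)
        deallocate_network(uuid)


    if __name__ == "__main__":
        logging.basicConfig(level=logging.DEBUG)
        terminate_instance("dac6ccec-b1a2-47d5-9750-d5f59f6743ae")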
[ 1286.822332] env[65680]: DEBUG oslo_concurrency.lockutils [None req-6020e4a5-7499-47ad-8f49-433af84d3aaa tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "dac6ccec-b1a2-47d5-9750-d5f59f6743ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.144s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1288.108025] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "7c9cf924-33bb-47b0-bd87-1d221b6fba5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1288.108312] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "7c9cf924-33bb-47b0-bd87-1d221b6fba5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1288.116925] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Starting instance... {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1288.158883] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1288.159085] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1288.160408] env[65680]: INFO nova.compute.claims [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1288.228742] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bf599ad-7d17-4ff0-8633-e2be41621942 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1288.236265] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f4a6145-c802-40f2-889b-b97a5fef2ef5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1288.265282] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2d8ecdc5-bff9-4166-90c8-69a27d05900f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1288.271776] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bbd9726-6eef-494b-be3c-ce0f37d62270 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1288.284213] env[65680]: DEBUG nova.compute.provider_tree [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1288.292282] env[65680]: DEBUG nova.scheduler.client.report [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1288.305744] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1288.306196] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Start building networks asynchronously for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1288.336906] env[65680]: DEBUG nova.compute.utils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Using /dev/sd instead of None {{(pid=65680) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1288.338126] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Allocating IP information in the background. 
{{(pid=65680) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1288.338295] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] allocate_for_instance() {{(pid=65680) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1288.345485] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Start building block device mappings for instance. {{(pid=65680) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1288.391036] env[65680]: DEBUG nova.policy [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c0a4078f7644f57884a39d3369ceb7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ec0d6e13ecf4b72b79052a4077a754f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65680) authorize /opt/stack/nova/nova/policy.py:203}} [ 1288.402041] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Start spawning the instance on the hypervisor. 
{{(pid=65680) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1288.421536] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:08:01Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:07:43Z,direct_url=,disk_format='vmdk',id=43113302-7f85-4bd9-95eb-c8e71f92d770,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='37ed348cd06e407e8b18e9a9365b037b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:07:44Z,virtual_size=,visibility=), allow threads: False {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1288.421768] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Flavor limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1288.421937] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Image limits 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1288.422148] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Flavor pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1288.422295] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Image pref 0:0:0 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1288.422438] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65680) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1288.422638] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1288.422793] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1288.422954] env[65680]: DEBUG 
nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Got 1 possible topologies {{(pid=65680) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1288.423128] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1288.423340] env[65680]: DEBUG nova.virt.hardware [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65680) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1288.424362] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68efcd16-4932-4f6e-ada8-9f9851a7e037 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1288.432176] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdbe4700-a743-4f88-b835-19a391717618 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1288.661095] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Successfully created port: a47ec588-a539-4517-aa79-661fd6b423f3 {{(pid=65680) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1289.165796] env[65680]: DEBUG nova.compute.manager [req-c7eac916-2524-41b2-bbd6-99165c219a35 req-b06fa49e-46c9-4ca9-9e20-40f322b23942 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Received event network-vif-plugged-a47ec588-a539-4517-aa79-661fd6b423f3 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1289.166122] env[65680]: DEBUG oslo_concurrency.lockutils [req-c7eac916-2524-41b2-bbd6-99165c219a35 req-b06fa49e-46c9-4ca9-9e20-40f322b23942 service nova] Acquiring lock "7c9cf924-33bb-47b0-bd87-1d221b6fba5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1289.166122] env[65680]: DEBUG oslo_concurrency.lockutils [req-c7eac916-2524-41b2-bbd6-99165c219a35 req-b06fa49e-46c9-4ca9-9e20-40f322b23942 service nova] Lock "7c9cf924-33bb-47b0-bd87-1d221b6fba5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1289.166392] env[65680]: DEBUG oslo_concurrency.lockutils [req-c7eac916-2524-41b2-bbd6-99165c219a35 req-b06fa49e-46c9-4ca9-9e20-40f322b23942 service nova] Lock "7c9cf924-33bb-47b0-bd87-1d221b6fba5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1289.166434] env[65680]: DEBUG nova.compute.manager 
[req-c7eac916-2524-41b2-bbd6-99165c219a35 req-b06fa49e-46c9-4ca9-9e20-40f322b23942 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] No waiting events found dispatching network-vif-plugged-a47ec588-a539-4517-aa79-661fd6b423f3 {{(pid=65680) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1289.166599] env[65680]: WARNING nova.compute.manager [req-c7eac916-2524-41b2-bbd6-99165c219a35 req-b06fa49e-46c9-4ca9-9e20-40f322b23942 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Received unexpected event network-vif-plugged-a47ec588-a539-4517-aa79-661fd6b423f3 for instance with vm_state building and task_state spawning. [ 1289.236979] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Successfully updated port: a47ec588-a539-4517-aa79-661fd6b423f3 {{(pid=65680) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1289.249321] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "refresh_cache-7c9cf924-33bb-47b0-bd87-1d221b6fba5b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1289.249461] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "refresh_cache-7c9cf924-33bb-47b0-bd87-1d221b6fba5b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1289.249594] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Building network info cache for instance {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1289.280015] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Instance cache missing network info. 
{{(pid=65680) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1289.423805] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Updating instance_info_cache with network_info: [{"id": "a47ec588-a539-4517-aa79-661fd6b423f3", "address": "fa:16:3e:14:86:8c", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa47ec588-a5", "ovs_interfaceid": "a47ec588-a539-4517-aa79-661fd6b423f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1289.436109] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "refresh_cache-7c9cf924-33bb-47b0-bd87-1d221b6fba5b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1289.436384] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Instance network_info: |[{"id": "a47ec588-a539-4517-aa79-661fd6b423f3", "address": "fa:16:3e:14:86:8c", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa47ec588-a5", "ovs_interfaceid": "a47ec588-a539-4517-aa79-661fd6b423f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65680) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1289.436784] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:14:86:8c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a06a63d6-2aeb-4084-8022-f804cac3fa74', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a47ec588-a539-4517-aa79-661fd6b423f3', 'vif_model': 'vmxnet3'}] {{(pid=65680) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1289.444165] env[65680]: DEBUG oslo.service.loopingcall [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65680) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1289.444559] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Creating VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1289.444773] env[65680]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fe27d639-6181-485f-9af8-20542d56e259 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.464743] env[65680]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1289.464743] env[65680]: value = "task-2847981" [ 1289.464743] env[65680]: _type = "Task" [ 1289.464743] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1289.472166] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847981, 'name': CreateVM_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1289.975107] env[65680]: DEBUG oslo_vmware.api [-] Task: {'id': task-2847981, 'name': CreateVM_Task, 'duration_secs': 0.287326} completed successfully. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1289.975355] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Created VM on the ESX host {{(pid=65680) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1289.981955] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1289.982139] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1289.982456] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1289.982689] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a07f2300-1de8-4e55-9930-cdfe8a6271ec {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.987236] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1289.987236] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]5229ddbc-0010-8ca9-b2c7-23058cfad9c8" [ 1289.987236] env[65680]: _type = "Task" [ 1289.987236] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1289.994858] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': session[524bad36-3b0e-1fd0-435e-68a984055d4c]5229ddbc-0010-8ca9-b2c7-23058cfad9c8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1290.498233] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1290.498673] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Processing image 43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1290.498760] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1290.498829] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquired lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1290.498993] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.499231] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5eb97f3b-2c41-4a98-9ad6-cc60c0a2ccca {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.515385] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.515560] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65680) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1290.516214] env[65680]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2a4aa28c-d983-4dfe-b0ec-f93a4406519e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.521161] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1290.521161] env[65680]: value = "session[524bad36-3b0e-1fd0-435e-68a984055d4c]52abd050-466c-ec86-a337-52d11cc69b2b" [ 1290.521161] env[65680]: _type = "Task" [ 1290.521161] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1290.533999] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Preparing fetch location {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1290.534234] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating directory with path [datastore1] vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.534432] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b6f79d5f-d637-4158-99c4-bcf28956752f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.553563] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Created directory with path [datastore1] vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770 {{(pid=65680) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.553738] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Fetch image to [datastore1] vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1290.553903] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to [datastore1] vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1290.554573] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1ab0088-76dd-447f-8acd-35d675b20066 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.560845] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e866891-9932-4e74-93a2-308e3f677900 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.569428] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e943a8fd-a112-485b-80b3-b1b3a682a9e5 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.598489] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edab257b-d7f2-46e3-bed8-814946a09cbf {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.603518] env[65680]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fce5f047-7703-4404-8dff-4fa2eb8c27e6 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.625253] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Downloading image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to the data store datastore1 {{(pid=65680) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1290.668010] env[65680]: DEBUG oslo_vmware.rw_handles [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65680) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1290.725978] env[65680]: DEBUG oslo_vmware.rw_handles [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Completed reading data from the image iterator. {{(pid=65680) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1290.726200] env[65680]: DEBUG oslo_vmware.rw_handles [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65680) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1291.190565] env[65680]: DEBUG nova.compute.manager [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Received event network-changed-a47ec588-a539-4517-aa79-661fd6b423f3 {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1291.190687] env[65680]: DEBUG nova.compute.manager [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Refreshing instance network info cache due to event network-changed-a47ec588-a539-4517-aa79-661fd6b423f3. {{(pid=65680) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1291.190877] env[65680]: DEBUG oslo_concurrency.lockutils [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] Acquiring lock "refresh_cache-7c9cf924-33bb-47b0-bd87-1d221b6fba5b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1291.191046] env[65680]: DEBUG oslo_concurrency.lockutils [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] Acquired lock "refresh_cache-7c9cf924-33bb-47b0-bd87-1d221b6fba5b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1291.191173] env[65680]: DEBUG nova.network.neutron [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Refreshing network info cache for port a47ec588-a539-4517-aa79-661fd6b423f3 {{(pid=65680) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1291.293145] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1291.293333] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1291.293456] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1291.304220] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Skipping network cache update for instance because it is Building. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1291.304369] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1291.421096] env[65680]: DEBUG nova.network.neutron [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Updated VIF entry in instance network info cache for port a47ec588-a539-4517-aa79-661fd6b423f3. 
{{(pid=65680) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1291.421468] env[65680]: DEBUG nova.network.neutron [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Updating instance_info_cache with network_info: [{"id": "a47ec588-a539-4517-aa79-661fd6b423f3", "address": "fa:16:3e:14:86:8c", "network": {"id": "88155a8f-de41-4a3d-8802-3a6c5765e431", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-792625309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ec0d6e13ecf4b72b79052a4077a754f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa47ec588-a5", "ovs_interfaceid": "a47ec588-a539-4517-aa79-661fd6b423f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1291.431205] env[65680]: DEBUG oslo_concurrency.lockutils [req-e55590df-15a9-485b-b8b4-2698cd8f6f93 req-470c9388-3be7-4f6f-a453-4fe6d7e0ac47 service nova] Releasing lock "refresh_cache-7c9cf924-33bb-47b0-bd87-1d221b6fba5b" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1293.292807] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1293.293253] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1296.294099] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1296.294476] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1297.293021] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1297.293262] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1297.302813] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1297.303127] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1297.303208] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1297.303381] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1297.304696] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e818771a-1b8f-4ac4-81c7-95c705271e38 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.312878] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa03a23a-2570-487c-b3d3-9837c076c3a7 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.326338] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76c870c0-0e1d-4e8a-bb86-cb080da9961a {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.332322] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa86f8b6-0e90-4d9a-8d1c-d7ababfe80e9 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.360462] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181071MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1297.360613] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1297.360766] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1297.395981] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Instance 7c9cf924-33bb-47b0-bd87-1d221b6fba5b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65680) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1297.396187] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1297.396330] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1297.419936] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca59098d-2515-441f-9698-f3945916675e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.426501] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86962c56-a2ca-484a-bc46-2b9f471aee44 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.456260] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-229fd5f8-4429-4e8f-88e9-3a8f42c0969b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.462802] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fe043bf-2db8-49ea-af12-bb40d32cdc96 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1297.475314] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1297.483558] env[65680]: 
DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1297.497451] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1297.497651] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1300.498627] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1338.496889] env[65680]: WARNING oslo_vmware.rw_handles [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles response.begin() [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1338.496889] env[65680]: ERROR oslo_vmware.rw_handles [ 1338.497652] env[65680]: DEBUG nova.virt.vmwareapi.images [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Downloaded image file data 43113302-7f85-4bd9-95eb-c8e71f92d770 to vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk on the data store datastore1 {{(pid=65680) 
fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1338.498962] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Caching image {{(pid=65680) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1338.499234] env[65680]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Copying Virtual Disk [datastore1] vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/tmp-sparse.vmdk to [datastore1] vmware_temp/e33737dd-0838-4cb5-ab27-2dfd3b7968f7/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk {{(pid=65680) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1338.499524] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-270253a9-95cd-4732-b754-804ca82cb489 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1338.507711] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1338.507711] env[65680]: value = "task-2847982" [ 1338.507711] env[65680]: _type = "Task" [ 1338.507711] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1338.515233] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847982, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1339.018460] env[65680]: DEBUG oslo_vmware.exceptions [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Fault InvalidArgument not matched. 
{{(pid=65680) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1339.018705] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Releasing lock "[datastore1] devstack-image-cache_base/43113302-7f85-4bd9-95eb-c8e71f92d770/43113302-7f85-4bd9-95eb-c8e71f92d770.vmdk" {{(pid=65680) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1339.019250] env[65680]: ERROR nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1339.019250] env[65680]: Faults: ['InvalidArgument'] [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Traceback (most recent call last): [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] yield resources [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self.driver.spawn(context, instance, image_meta, [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self._fetch_image_if_missing(context, vi) [ 1339.019250] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] image_cache(vi, tmp_image_ds_loc) [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] vm_util.copy_virtual_disk( [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] session._wait_for_task(vmdk_copy_task) [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] return self.wait_for_task(task_ref) [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] return evt.wait() [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] result = hub.switch() [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1339.019781] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] return self.greenlet.switch() [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self.f(*self.args, **self.kw) [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] raise exceptions.translate_fault(task_info.error) [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Faults: ['InvalidArgument'] [ 1339.020233] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] [ 1339.020233] env[65680]: INFO nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Terminating instance [ 1339.022281] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Start destroying the instance on the hypervisor. 
{{(pid=65680) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1339.022479] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Destroying instance {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1339.023226] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0026b3e8-2727-4b09-b141-09421d117e17 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.029635] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Unregistering the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1339.029848] env[65680]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9caa4c42-7837-4720-916a-60327274c034 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.088542] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Unregistered the VM {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1339.088747] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Deleting contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1339.088926] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleting the datastore file [datastore1] 7c9cf924-33bb-47b0-bd87-1d221b6fba5b {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1339.089206] env[65680]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-789bc212-7463-4982-829d-16850b066f5e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.094834] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Waiting for the task: (returnval){ [ 1339.094834] env[65680]: value = "task-2847984" [ 1339.094834] env[65680]: _type = "Task" [ 1339.094834] env[65680]: } to complete. {{(pid=65680) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1339.102408] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847984, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1339.604620] env[65680]: DEBUG oslo_vmware.api [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Task: {'id': task-2847984, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067581} completed successfully. {{(pid=65680) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1339.604986] env[65680]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleted the datastore file {{(pid=65680) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1339.604986] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Deleted contents of the VM from datastore datastore1 {{(pid=65680) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1339.605198] env[65680]: DEBUG nova.virt.vmwareapi.vmops [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Instance destroyed {{(pid=65680) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1339.605377] env[65680]: INFO nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Took 0.58 seconds to destroy the instance on the hypervisor. 
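For context on the task-invocation and polling lines recorded above (Invoking CopyVirtualDisk_Task / DeleteDatastoreFile_Task, "Waiting for the task ... to complete", "progress is 0%", "completed successfully"): the snippet below is a minimal, illustrative sketch of the oslo.vmware pattern involved, not Nova's actual code. The copy_disk() helper name, the connection parameters, and the argument choices are assumptions; only invoke_api(), session.vim, and wait_for_task() are taken from the library calls visible in the tracebacks.

from oslo_vmware import api

def copy_disk(session, dc_ref, source_path, dest_path):
    # Kick off the server-side task; the return value is a Task managed object,
    # which appears in the log as value = "task-..." / _type = "Task".
    vim = session.vim
    task = session.invoke_api(
        vim, 'CopyVirtualDisk_Task',
        vim.service_content.virtualDiskManager,
        sourceName=source_path, sourceDatacenter=dc_ref,
        destName=dest_path, destDatacenter=dc_ref)
    # wait_for_task() polls the task (the _poll_task DEBUG lines) and raises a
    # VimFaultException on failure, e.g. the InvalidArgument 'fileType' fault
    # recorded in the traceback above.
    return session.wait_for_task(task)

# Hypothetical usage; a real deployment takes these values from nova.conf [vmware],
# and constructor argument names can vary between oslo.vmware releases:
# session = api.VMwareAPISession('vc.example.org', 'user', 'secret',
#                                api_retry_count=10, task_poll_interval=0.5)
# copy_disk(session, dc_ref, '[datastore1] .../tmp-sparse.vmdk',
#           '[datastore1] .../image.vmdk')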
[ 1339.607468] env[65680]: DEBUG nova.compute.claims [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Aborting claim: {{(pid=65680) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1339.607646] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1339.607860] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1339.669244] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3703c5a6-a5be-4eeb-b6c5-930dd96b42fa {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.677573] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3afc6ca8-3134-47f5-af6e-a2348c7ed140 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.706364] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68b9889b-2a9d-4651-89dc-b743eefa8c7f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.713176] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a31241da-561e-4700-96d8-7d05b73d682b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1339.725906] env[65680]: DEBUG nova.compute.provider_tree [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1339.733841] env[65680]: DEBUG nova.scheduler.client.report [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1339.745978] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 
tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.138s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1339.746494] env[65680]: ERROR nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1339.746494] env[65680]: Faults: ['InvalidArgument'] [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Traceback (most recent call last): [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self.driver.spawn(context, instance, image_meta, [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self._fetch_image_if_missing(context, vi) [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] image_cache(vi, tmp_image_ds_loc) [ 1339.746494] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] vm_util.copy_virtual_disk( [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] session._wait_for_task(vmdk_copy_task) [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] return self.wait_for_task(task_ref) [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] return evt.wait() [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 
7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] result = hub.switch() [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] return self.greenlet.switch() [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1339.746819] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] self.f(*self.args, **self.kw) [ 1339.747181] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1339.747181] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] raise exceptions.translate_fault(task_info.error) [ 1339.747181] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1339.747181] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Faults: ['InvalidArgument'] [ 1339.747181] env[65680]: ERROR nova.compute.manager [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] [ 1339.747181] env[65680]: DEBUG nova.compute.utils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] VimFaultException {{(pid=65680) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1339.748504] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Build of instance 7c9cf924-33bb-47b0-bd87-1d221b6fba5b was re-scheduled: A specified parameter was not correct: fileType [ 1339.748504] env[65680]: Faults: ['InvalidArgument'] {{(pid=65680) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1339.748880] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Unplugging VIFs for instance {{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1339.749069] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65680) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1339.749240] env[65680]: DEBUG nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Deallocating network for instance {{(pid=65680) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1339.749405] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] deallocate_for_instance() {{(pid=65680) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1339.969790] env[65680]: DEBUG nova.network.neutron [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Updating instance_info_cache with network_info: [] {{(pid=65680) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1339.984957] env[65680]: INFO nova.compute.manager [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] [instance: 7c9cf924-33bb-47b0-bd87-1d221b6fba5b] Took 0.24 seconds to deallocate network for instance. [ 1340.063753] env[65680]: INFO nova.scheduler.client.report [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Deleted allocations for instance 7c9cf924-33bb-47b0-bd87-1d221b6fba5b [ 1340.080333] env[65680]: DEBUG oslo_concurrency.lockutils [None req-ff4a91f9-5396-4ebf-b9d8-5a7f6669f241 tempest-ServerDiskConfigTestJSON-295581907 tempest-ServerDiskConfigTestJSON-295581907-project-member] Lock "7c9cf924-33bb-47b0-bd87-1d221b6fba5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 51.972s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1347.293443] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1348.288270] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1353.293054] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1353.293054] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Starting heal instance info cache {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1353.293054] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Rebuilding the list of instances to heal {{(pid=65680) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1353.300888] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Didn't find any instances for network info cache update. {{(pid=65680) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1355.293984] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1355.294360] env[65680]: DEBUG nova.compute.manager [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65680) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1358.293694] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1358.294131] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1358.294131] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1358.294267] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager.update_available_resource {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1358.305621] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1358.305870] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1358.306054] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1358.306218] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65680) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1358.307583] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7c8123d-16b6-4795-b252-f88effb1aa31 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.315997] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa9cb323-83f5-4f07-b592-938dea918e29 {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.329542] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-993f1d59-4703-4d3b-9a7f-10729444cace {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.335509] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5e2060f-a565-49ab-a148-dd8b4521773e {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.363800] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181059MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65680) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1358.363920] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1358.364125] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1358.393282] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1358.393448] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=65680) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1358.405905] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22eee579-9bbd-4876-9849-38716db8adef {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.412631] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a5e1f8-01fe-400c-9e13-136f7b61901b {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.441893] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aa50e91-cd3f-4ad6-b220-bfe9ed86054f {{(pid=65680) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.448234] env[65680]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b0d2854-7a25-4bee-b615-291b13927333 {{(pid=65680) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1358.460697] env[65680]: DEBUG nova.compute.provider_tree [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed in ProviderTree for provider: 93ae29e4-bd04-4c19-80be-8057217cf400 {{(pid=65680) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1358.468350] env[65680]: DEBUG nova.scheduler.client.report [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Inventory has not changed for provider 93ae29e4-bd04-4c19-80be-8057217cf400 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65680) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1358.480196] env[65680]: DEBUG nova.compute.resource_tracker [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65680) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1358.480366] env[65680]: DEBUG oslo_concurrency.lockutils [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s {{(pid=65680) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1362.479582] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1363.288587] env[65680]: DEBUG oslo_service.periodic_task [None req-00cadf03-b972-4dae-85a5-53053028b234 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65680) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
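The "Running periodic task ..." and "Acquiring lock / acquired / released ... waited ... held" messages throughout this stretch come from oslo.service and oslo.concurrency respectively. The sketch below is an assumed, toy stand-in for the ComputeManager (not the real class) showing how those two patterns are typically wired together; the class and method names are illustrative.

from oslo_concurrency import lockutils
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class Manager(periodic_task.PeriodicTasks):
    """Toy stand-in for the compute manager seen in the log."""

    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=60)
    def update_available_resource(self, context):
        # Emits "Running periodic task Manager.update_available_resource"
        # each time the service loop calls run_periodic_tasks().
        self._update_available_resource()

    @lockutils.synchronized('compute_resources')
    def _update_available_resource(self):
        # lockutils' wrapper logs the "Acquiring lock ... by ...", "acquired ::
        # waited ..." and "released ... :: held ..." DEBUG lines shown above.
        pass  # recompute the resource view and report inventory here

# Normally oslo.service's looping call drives this on a timer:
# mgr = Manager()
# mgr.run_periodic_tasks(context=None)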