[ 455.195284] nova-conductor[51600]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 456.414032] nova-conductor[51600]: DEBUG oslo_db.sqlalchemy.engines [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51600) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 456.440347] nova-conductor[51600]: DEBUG nova.context [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),11743078-64bf-4468-a786-557c60808969(cell1) {{(pid=51600) load_cells /opt/stack/nova/nova/context.py:464}} [ 456.442174] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=51600) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 456.442375] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51600) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 456.442827] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=51600) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 456.443177] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=51600) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 456.443354] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51600) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 456.444249] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=51600) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 456.449412] nova-conductor[51600]: DEBUG oslo_db.sqlalchemy.engines [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51600) _check_effective_sql_mode 
/usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 456.449798] nova-conductor[51600]: DEBUG oslo_db.sqlalchemy.engines [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51600) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 456.512250] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Acquiring lock "singleton_lock" {{(pid=51600) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 456.512421] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Acquired lock "singleton_lock" {{(pid=51600) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 456.512653] nova-conductor[51600]: DEBUG oslo_concurrency.lockutils [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Releasing lock "singleton_lock" {{(pid=51600) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 456.513069] nova-conductor[51600]: INFO oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Starting 2 workers [ 456.517466] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Started child 52019 {{(pid=51600) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 456.520819] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Started child 52020 {{(pid=51600) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 456.521410] nova-conductor[52019]: INFO nova.service [-] Starting conductor node (version 0.1.0) [ 456.521887] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Full set of CONF: {{(pid=51600) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}} [ 456.522120] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ******************************************************************************** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 456.522284] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] Configuration options gathered from: {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 456.522455] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 456.522809] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] config files: ['/etc/nova/nova.conf'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 456.522935] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ================================================================================ {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 456.523352] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] allow_resize_to_same_host = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.523567] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] arq_binding_timeout = 300 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.523758] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] block_device_allocate_retries = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.523960] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] block_device_allocate_retries_interval = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.524183] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cert = self.pem {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.524436] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute_driver = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.524675] nova-conductor[52020]: INFO nova.service [-] Starting conductor node (version 0.1.0) [ 456.524855] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute_monitors = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.524933] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] config_dir = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.525146] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] config_drive_format = iso9660 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.525293] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] config_file = ['/etc/nova/nova.conf'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.525488] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] config_source = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.525673] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] console_host = devstack {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.525861] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] control_exchange = nova {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.526065] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cpu_allocation_ratio = None {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.526242] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] daemon = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.526434] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] debug = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.526604] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] default_access_ip_network_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.526788] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] default_availability_zone = nova {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.526965] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] default_ephemeral_format = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.527273] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.527454] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] default_schedule_zone = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.527623] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] disk_allocation_ratio = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.527808] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] enable_new_services = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.528100] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] enabled_apis = ['osapi_compute'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.528236] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] enabled_ssl_apis = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.528417] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] flat_injected = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.528574] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] force_config_drive = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.528762] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] force_raw_images = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.528952] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] graceful_shutdown_timeout = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.529148] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] heal_instance_info_cache_interval = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.529589] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] host = devstack {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.529824] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] initial_cpu_allocation_ratio = 4.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.529998] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] initial_disk_allocation_ratio = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.530175] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] initial_ram_allocation_ratio = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.530423] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.530602] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_build_timeout = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.530809] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_delete_interval = 300 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.531011] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_format = [instance: %(uuid)s] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.531223] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_name_template = instance-%08x {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.531418] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_usage_audit = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.531626] nova-conductor[51600]: DEBUG 
oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_usage_audit_period = month {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.531806] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.532024] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] instances_path = /opt/stack/data/nova/instances {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.532197] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] internal_service_availability_zone = internal {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.532353] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] key = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.532520] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] live_migration_retry_count = 30 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.532691] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_config_append = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.532870] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.533082] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_dir = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.533277] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.533376] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_options = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.533410] nova-conductor[52019]: DEBUG oslo_db.sqlalchemy.engines [None req-78d7a4ed-428a-49aa-b285-95212fb4e45d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52019) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 456.533551] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_rotate_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.533751] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_rotate_interval_type = days {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.533920] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] log_rotation_type = none {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.534061] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.534188] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.534357] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.534543] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.534667] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.534883] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] long_rpc_timeout = 1800 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.535048] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] max_concurrent_builds = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.535228] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] max_concurrent_live_migrations = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.535384] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] max_concurrent_snapshots = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.535539] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] max_local_block_devices = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.535693] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] max_logfile_count = 30 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.535847] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] max_logfile_size_mb = 200 
{{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.536026] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] maximum_instance_delete_attempts = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.536218] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metadata_listen = 0.0.0.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.536477] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metadata_listen_port = 8775 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.536676] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metadata_workers = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.536839] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] migrate_max_retries = -1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.537008] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] mkisofs_cmd = genisoimage {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.537222] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] my_block_storage_ip = 10.180.1.21 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.537353] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] my_ip = 10.180.1.21 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.537532] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] network_allocate_retries = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.537741] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.537907] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] osapi_compute_listen = 0.0.0.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.538083] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] osapi_compute_listen_port = 8774 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.538246] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] osapi_compute_unique_server_name_scope = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.538408] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] osapi_compute_workers = 2 {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.538565] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] password_length = 12 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.538723] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] periodic_enable = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.538879] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] periodic_fuzzy_delay = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.539049] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] pointer_model = usbtablet {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.539238] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] preallocate_images = none {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.539399] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] publish_errors = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.539523] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] pybasedir = /opt/stack/nova {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.539720] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ram_allocation_ratio = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.539889] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rate_limit_burst = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.540067] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rate_limit_except_level = CRITICAL {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.540226] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rate_limit_interval = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.540379] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reboot_timeout = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.540532] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reclaim_instance_interval = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.540682] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] record = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.540856] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reimage_timeout_per_gb = 20 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541026] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] report_interval = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541186] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rescue_timeout = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541362] nova-conductor[52020]: DEBUG oslo_db.sqlalchemy.engines [None req-b4753e26-5f89-44bc-ae5e-069dbe6c9c6f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52020) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 456.541400] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reserved_host_cpus = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541519] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reserved_host_disk_mb = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541670] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reserved_host_memory_mb = 512 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541840] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] reserved_huge_pages = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.541999] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] resize_confirm_window = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.542168] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] resize_fs_using_block_device = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.542323] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] resume_guests_state_on_host_boot = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.542484] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.542662] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rpc_response_timeout = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.542835] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] run_external_periodic_tasks = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 
456.543023] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] running_deleted_instance_action = reap {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.543199] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] running_deleted_instance_poll_interval = 1800 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.543393] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] running_deleted_instance_timeout = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.543573] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler_instance_sync_interval = 120 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.543730] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_down_time = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.543924] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] servicegroup_driver = db {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.544095] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] shelved_offload_time = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.544253] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] shelved_poll_interval = 3600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.544446] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] shutdown_timeout = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.544622] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] source_is_ipv6 = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.544782] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ssl_only = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.544974] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] state_path = /opt/stack/data/nova {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.545138] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] sync_power_state_interval = 600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.545310] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] sync_power_state_pool_size = 1000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.545473] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] syslog_log_facility = LOG_USER {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.545630] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] tempdir = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.545804] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] timeout_nbd = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.545973] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] transport_url = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.546149] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] update_resources_interval = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.546317] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_cow_images = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.546497] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_eventlog = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.546655] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_journal = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.546810] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_json = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.546964] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_rootwrap_daemon = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.547152] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_stderr = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.547326] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] use_syslog = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.547474] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vcpu_pin_set = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.547640] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vif_plugging_is_fatal = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.547826] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vif_plugging_timeout = 300 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.548036] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] virt_mkfs = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.548203] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] volume_usage_poll_interval = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.548362] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] watch_log_file = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.548574] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] web = /usr/share/spice-html5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 456.548853] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_concurrency.disable_process_locking = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.549067] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.549280] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.549451] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.549619] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_metrics.metrics_process_name = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.549818] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.549997] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.550239] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.auth_strategy = keystone {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.550431] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.compute_link_prefix = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.550621] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 
2008-02-01 2008-09-01 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.550822] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.dhcp_domain = novalocal {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.551029] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.enable_instance_password = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.551198] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.glance_link_prefix = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.551361] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.551544] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.instance_list_cells_batch_strategy = distributed {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.551704] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.instance_list_per_project_cells = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.551864] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.list_records_by_skipping_down_cells = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.552059] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.local_metadata_per_cell = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.552191] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.max_limit = 1000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.552351] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.metadata_cache_expiration = 15 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.552540] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.neutron_default_tenant_id = default {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.552706] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.use_forwarded_for = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.552869] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.use_neutron_default_nets = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.553041] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] 
api.vendordata_dynamic_connect_timeout = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.553205] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.vendordata_dynamic_failure_fatal = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.553371] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.553539] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.vendordata_dynamic_ssl_certfile = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.553710] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.vendordata_dynamic_targets = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.553898] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.vendordata_jsonfile_path = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.554082] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api.vendordata_providers = ['StaticJSON'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.554341] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.backend = dogpile.cache.memcached {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.554526] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.backend_argument = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.554717] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.config_prefix = cache.oslo {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.554904] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.dead_timeout = 60.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555079] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.debug_cache_backend = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555258] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.enable_retry_client = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555416] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.enable_socket_keepalive = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555580] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] 
cache.enabled = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555768] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.expiration_time = 600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555963] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.hashclient_retry_attempts = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.555993] nova-conductor[52019]: DEBUG nova.service [None req-78d7a4ed-428a-49aa-b285-95212fb4e45d None None] Creating RPC server for service conductor {{(pid=52019) start /opt/stack/nova/nova/service.py:182}} [ 456.556164] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.hashclient_retry_delay = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.556327] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_dead_retry = 300 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.556493] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_password = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.556659] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.556823] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.557013] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_pool_maxsize = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.557182] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_pool_unused_timeout = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.557344] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_sasl_enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.557512] nova-conductor[52020]: DEBUG nova.service [None req-b4753e26-5f89-44bc-ae5e-069dbe6c9c6f None None] Creating RPC server for service conductor {{(pid=52020) start /opt/stack/nova/nova/service.py:182}} [ 456.557545] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_servers = ['localhost:11211'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.557677] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_socket_timeout = 1.0 {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.557840] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.memcache_username = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558012] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.proxies = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558175] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.retry_attempts = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558333] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.retry_delay = 0.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558493] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.socket_keepalive_count = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558647] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.socket_keepalive_idle = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558804] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.socket_keepalive_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.558957] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.tls_allowed_ciphers = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.559118] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.tls_cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.559269] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.tls_certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.559423] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.tls_enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.559573] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cache.tls_keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.559822] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.560029] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.auth_type = password {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.560218] nova-conductor[51600]: DEBUG oslo_service.service 
[None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.560407] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.catalog_info = volumev3::publicURL {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.560600] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.560790] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.560996] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.cross_az_attach = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.561175] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.debug = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.561333] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.endpoint_template = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.561518] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.http_retries = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.561682] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.561836] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562016] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.os_region_name = RegionOne {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562172] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562328] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cinder.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562499] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562656] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.cpu_dedicated_set = None {{(pid=51600) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562810] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.cpu_shared_set = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.562977] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.image_type_exclude_list = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.563147] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.live_migration_wait_for_vif_plug = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.563327] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.max_concurrent_disk_ops = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.563484] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.max_disk_devices_to_attach = -1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.563663] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.563831] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.563994] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.resource_provider_association_refresh = 300 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.564165] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.shutdown_retry_interval = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.564339] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.564516] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] conductor.workers = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.564691] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] console.allowed_origins = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.564851] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] console.ssl_ciphers = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.565027] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] console.ssl_minimum_version = default {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.565204] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] consoleauth.token_ttl = 600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.565396] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.565559] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.565720] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.565876] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566059] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566215] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566378] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566531] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566683] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566838] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.566993] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.region_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.567156] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.service_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.567322] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.service_type = accelerator {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.567479] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.567653] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.567823] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.567981] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.568171] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.568329] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] cyborg.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.568518] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.backend = sqlalchemy {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.568698] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.connection = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.568870] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.connection_debug = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.569051] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.connection_parameters = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.569214] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.connection_recycle_time = 3600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.569377] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.connection_trace = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.569534] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.db_inc_retry_interval = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.569698] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.db_max_retries = 20 {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.569878] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.db_max_retry_interval = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.570051] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.db_retry_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.570224] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.max_overflow = 50 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.570384] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.max_pool_size = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.570547] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.max_retries = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.570717] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.mysql_enable_ndb = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.570944] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.mysql_sql_mode = TRADITIONAL {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.571131] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.mysql_wsrep_sync_wait = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.571297] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.pool_timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.571463] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.retry_interval = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.571617] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.slave_connection = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.571783] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.sqlite_synchronous = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.571946] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] database.use_db_reconnect = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.572134] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.backend = sqlalchemy {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
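Everything in this stretch of the log is oslo.config's standard startup option dump: after the conductor workers fork, oslo.service walks every registered option group (cache, cinder, compute, conductor, database, api_database, and so on) and logs each effective value at DEBUG via ConfigOpts.log_opt_values(), masking any option registered with secret=True (which is why database.connection and database.slave_connection show as ****). The {{(pid=...) log_opt_values .../oslo_config/cfg.py:2609}} tail on each entry is oslo.log's debug format suffix, which appends the emitting function and its source location. A minimal, self-contained sketch of that mechanism follows; the groups, options, and defaults are illustrative stand-ins, not Nova's real option definitions.

    # conf_dump_sketch.py -- illustrative only; mirrors the dump mechanism,
    # not Nova's actual option set.
    import logging

    from oslo_config import cfg

    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    database_group = cfg.OptGroup(name='database', title='Database options')
    conductor_group = cfg.OptGroup(name='conductor', title='Conductor options')

    database_opts = [
        # secret=True is what makes the dump print "database.connection = ****"
        cfg.StrOpt('connection', secret=True, help='SQLAlchemy connection URL.'),
        cfg.IntOpt('max_pool_size', default=5, help='Maximum connection pool size.'),
    ]
    conductor_opts = [
        cfg.IntOpt('workers', default=2, help='Number of conductor workers.'),
    ]

    CONF.register_group(database_group)
    CONF.register_group(conductor_group)
    CONF.register_opts(database_opts, group=database_group)
    CONF.register_opts(conductor_opts, group=conductor_group)

    if __name__ == '__main__':
        logging.basicConfig(level=logging.DEBUG)
        # Parse the command line (e.g. --config-file my.conf) and any default
        # config files for the project.
        CONF(project='nova')
        # This is the call behind the per-option DEBUG lines above
        # (oslo_config/cfg.py, log_opt_values).
        CONF.log_opt_values(LOG, logging.DEBUG)

Running it as "python conf_dump_sketch.py --config-file my.conf" against an INI file with [database] and [conductor] sections produces the same kind of per-group, one-option-per-line DEBUG dump, though without oslo.log's pid/function/location suffix since the sketch uses the stdlib logging format.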
[ 456.572309] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.connection = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.572473] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.connection_debug = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.572640] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.connection_parameters = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.572800] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.connection_recycle_time = 3600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.572963] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.connection_trace = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.573132] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.db_inc_retry_interval = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.573292] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.db_max_retries = 20 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.573449] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.db_max_retry_interval = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.573607] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.db_retry_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.573771] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.max_overflow = 50 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.573931] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.max_pool_size = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.574107] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.max_retries = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.574266] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.mysql_enable_ndb = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.574430] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
456.574584] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.mysql_wsrep_sync_wait = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.574741] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.pool_timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.574917] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.retry_interval = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.575082] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.slave_connection = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.575243] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] api_database.sqlite_synchronous = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.575433] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] devices.enabled_mdev_types = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.575612] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.575772] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ephemeral_storage_encryption.enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.575936] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ephemeral_storage_encryption.key_size = 512 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.576137] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.api_servers = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.576299] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.576462] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.576621] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.576793] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.576947] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.577132] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.debug = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.577341] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.default_trusted_certificate_ids = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.577502] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.enable_certificate_validation = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.577680] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.enable_rbd_download = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.577842] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578013] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578205] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578365] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578520] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578676] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.num_retries = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578839] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.rbd_ceph_conf = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.578997] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.rbd_connect_timeout = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.579177] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.rbd_pool = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.579342] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] 
glance.rbd_user = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.579496] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.region_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.579646] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.service_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.579851] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.service_type = image {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580026] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580184] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580339] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580490] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580662] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580826] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.verify_glance_signatures = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.580981] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] glance.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.581156] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] guestfs.debug = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.581350] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.config_drive_cdrom = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.581511] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.config_drive_inject_password = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.581669] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.581827] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.enable_instance_metrics_collection = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.581999] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.enable_remotefx = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.582192] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.instances_path_share = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.582358] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.iscsi_initiator_list = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.582516] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.limit_cpu_features = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.582674] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.582832] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.582992] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.power_state_check_timeframe = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.583161] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.power_state_event_polling_interval = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.583339] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.583498] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.use_multipath_io = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.583677] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.volume_attach_retry_count = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.583842] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.volume_attach_retry_interval = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.584007] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.vswitch_name = None {{(pid=51600) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.584170] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.584334] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] mks.enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.585287] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.585484] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] image_cache.manager_interval = 2400 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.585654] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] image_cache.precache_concurrency = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.585697] nova-conductor[52019]: DEBUG nova.service [None req-78d7a4ed-428a-49aa-b285-95212fb4e45d None None] Join ServiceGroup membership for this service conductor {{(pid=52019) start /opt/stack/nova/nova/service.py:199}} [ 456.585827] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] image_cache.remove_unused_base_images = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.585926] nova-conductor[52019]: DEBUG nova.servicegroup.drivers.db [None req-78d7a4ed-428a-49aa-b285-95212fb4e45d None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52019) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 456.586029] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.586207] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.586384] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] image_cache.subdirectory_name = _base {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.586589] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.api_max_retries = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.586753] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.api_retry_interval = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.586913] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] 
ironic.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.587088] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.auth_type = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.587246] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.587394] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.587553] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.587752] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.587912] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.588100] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.588262] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.588418] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.588575] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.588727] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.588883] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.partition_key = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.589052] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.peer_list = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.589212] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.region_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.589370] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.serial_console_state_timeout = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.589523] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.service_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.589696] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.service_type = baremetal {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.589882] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.590051] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.590122] nova-conductor[52020]: DEBUG nova.service [None req-b4753e26-5f89-44bc-ae5e-069dbe6c9c6f None None] Join ServiceGroup membership for this service conductor {{(pid=52020) start /opt/stack/nova/nova/service.py:199}} [ 456.590208] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.590360] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.590572] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.591059] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ironic.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.591059] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.591152] nova-conductor[52020]: DEBUG nova.servicegroup.drivers.db [None req-b4753e26-5f89-44bc-ae5e-069dbe6c9c6f None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52020) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 456.591183] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] key_manager.fixed_key = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.591380] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
456.591560] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.barbican_api_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.591734] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.barbican_endpoint = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.591928] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.barbican_endpoint_type = public {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.592120] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.barbican_region_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.592303] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.592460] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.592620] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.592774] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.592927] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.593119] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.number_of_retries = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.593293] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.retry_delay = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.593471] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.send_service_user_token = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.593630] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.593782] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.593937] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.verify_ssl = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.594100] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican.verify_ssl_path = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.594284] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.594468] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.auth_type = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.594625] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.594779] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.594943] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.595113] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.595270] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.595424] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.595577] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] barbican_service_user.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.595745] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.approle_role_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.595901] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.approle_secret_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.596072] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.596231] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.596388] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.596544] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.596696] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.596884] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.kv_mountpoint = secret {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.597069] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.kv_version = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.597233] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.namespace = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.597385] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.root_token_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.597544] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.597722] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.ssl_ca_crt_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.597881] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.598071] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.use_ssl = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.598253] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.598443] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.598617] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.certfile = None {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.598797] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.598975] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.599145] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.599301] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.599455] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.599628] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.599811] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.599964] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.600131] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.region_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.600281] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.service_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.600447] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.service_type = identity {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.600600] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.600749] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.600901] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.601065] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.601238] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.601391] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] keystone.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.601624] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.connection_uri = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.601807] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_mode = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.601970] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_model_extra_flags = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.602173] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_models = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.602363] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_power_governor_high = performance {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.602530] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_power_governor_low = powersave {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.602689] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_power_management = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.602878] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.603067] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.device_detach_attempts = 8 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.603229] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.device_detach_timeout = 20 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.603390] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.disk_cachemodes = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.603554] nova-conductor[51600]: DEBUG 
oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.disk_prefix = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.603717] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.enabled_perf_events = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.603876] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.file_backed_memory = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.604044] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.gid_maps = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.604202] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.hw_disk_discard = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.604354] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.hw_machine_type = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.604520] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_rbd_ceph_conf = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.604678] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.604841] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605023] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_rbd_glance_store_name = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605187] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_rbd_pool = rbd {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605351] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_type = default {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605504] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.images_volume_group = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605663] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.inject_key = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605818] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.inject_partition = -2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.605981] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.inject_password = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.606180] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.iscsi_iface = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.606338] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.iser_use_multipath = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.606495] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_bandwidth = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.606653] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_completion_timeout = 800 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.606811] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_downtime = 500 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.606970] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_downtime_delay = 75 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.607138] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_downtime_steps = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.607295] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_inbound_addr = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.607450] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_permit_auto_converge = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.607620] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_permit_post_copy = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.607801] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_scheme = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.607965] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_timeout_action = abort {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.608137] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_tunnelled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.608292] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_uri = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.608453] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.live_migration_with_native_tls = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.608609] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.max_queues = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.608766] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.mem_stats_period_seconds = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.608965] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.nfs_mount_options = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.609313] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.609487] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.num_aoe_discover_tries = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.609650] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.num_iser_scan_tries = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.609853] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.num_memory_encrypted_guests = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.610032] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.num_nvme_discover_tries = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.610200] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.num_pcie_ports = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.610363] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.num_volume_scan_tries = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.610568] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.pmem_namespaces = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.610729] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.quobyte_client_cfg = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.610983] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.611164] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rbd_connect_timeout = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.611323] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.611478] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.611631] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rbd_secret_uuid = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.611782] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rbd_user = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.611945] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.realtime_scheduler_priority = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.612124] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.remote_filesystem_transport = ssh {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.612280] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rescue_image_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.612436] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rescue_kernel_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.612586] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rescue_ramdisk_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.612747] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rng_dev_path = /dev/urandom {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.612900] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.rx_queue_size = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.613068] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.smbfs_mount_options = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.613295] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.613475] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.snapshot_compression = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.613628] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.snapshot_image_format = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.613857] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.614035] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.sparse_logical_volumes = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.614199] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.swtpm_enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.614387] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.swtpm_group = tss {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.614553] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.swtpm_user = tss {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.614721] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.sysinfo_serial = unique {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.614878] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.tx_queue_size = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.615046] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.uid_maps = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.615203] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.use_virtio_for_bridges = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.615367] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.virt_type = kvm {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.615531] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.volume_clear = zero {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.615685] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.volume_clear_size = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.615843] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.volume_use_multipath = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.616028] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_cache_path = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.616194] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.616359] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_mount_group = qemu {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.616518] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_mount_opts = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.616681] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.616895] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.617075] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.vzstorage_mount_user = stack {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.617234] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.617398] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.617562] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.auth_type = password {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.617736] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.cafile = None {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.617894] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.618075] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.618257] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.618409] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.618584] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.default_floating_pool = public {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.618732] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.618894] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.extension_sync_interval = 600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.619076] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.http_retries = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.619252] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.619408] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.619579] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.619812] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.metadata_proxy_shared_secret = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.619988] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.620276] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.ovs_bridge = br-int {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.620447] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.physnets = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.620613] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.region_name = RegionOne {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.620802] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.service_metadata_proxy = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.620939] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.service_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.621122] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.service_type = network {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.621285] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.621438] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.621592] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.621750] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.621925] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.622112] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] neutron.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.622293] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] notifications.bdms_in_notifications = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.622477] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] notifications.default_level = INFO {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.622677] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] notifications.notification_format = unversioned {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.622874] nova-conductor[51600]: DEBUG 
oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] notifications.notify_on_state_change = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.623095] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.623302] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] pci.alias = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.623472] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] pci.device_spec = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.623638] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] pci.report_in_placement = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.623830] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.624013] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.auth_type = password {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.624213] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.auth_url = http://10.180.1.21/identity {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.624374] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.624530] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.624691] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.624849] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625020] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625178] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.default_domain_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625332] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.default_domain_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625484] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.domain_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625635] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.domain_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625788] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.625948] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.626112] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.626265] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.626414] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.626576] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.password = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.626727] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.project_domain_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.626887] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.project_domain_name = Default {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.627057] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.project_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.627226] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.project_name = service {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.627391] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.region_name = RegionOne {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.627547] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.service_name = None 
{{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.627730] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.service_type = placement {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.627897] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.628068] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.628226] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.628379] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.system_scope = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.628555] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.628712] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.trust_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.628867] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.user_domain_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.629040] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.user_domain_name = Default {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.629197] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.user_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.629362] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.username = placement {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.629541] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.629731] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] placement.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.629934] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.cores = 20 {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.630148] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.count_usage_from_placement = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.630361] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.630510] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.injected_file_content_bytes = 10240 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.630701] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.injected_file_path_length = 255 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.630834] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.injected_files = 5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.630991] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.instances = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.631170] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.key_pairs = 100 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.631343] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.metadata_items = 128 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.631537] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.ram = 51200 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.631701] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.recheck_quota = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.631864] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.server_group_members = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.632032] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] quota.server_groups = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.632200] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rdp.enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.632504] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.632716] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.632942] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.633194] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.image_metadata_prefilter = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.633393] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.633573] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.max_attempts = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.633756] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.max_placement_results = 1000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.633936] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.634138] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.query_placement_for_availability_zone = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.634302] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.query_placement_for_image_type_support = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.634490] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.634712] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] scheduler.workers = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.634909] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.635091] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.635288] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.635457] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.635639] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.635800] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.635996] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.636235] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.636406] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.host_subset_size = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.636564] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.image_properties_default_architecture = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.636727] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.636888] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.isolated_hosts = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.637063] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.isolated_images = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.637221] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.max_instances_per_host = 50 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.637375] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.637546] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.pci_in_placement = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.637745] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.637913] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.638086] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.638247] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.638404] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.638562] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.638743] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.track_instance_changes = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.638917] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.639135] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metrics.required = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.639313] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metrics.weight_multiplier = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.639475] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metrics.weight_of_unavailable = -10000.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.639636] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] metrics.weight_setting = [] {{(pid=51600) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.639972] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.640162] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] serial_console.enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.640360] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] serial_console.port_range = 10000:20000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.640565] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.640772] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.640988] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] serial_console.serialproxy_port = 6083 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.641194] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.641366] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.auth_type = password {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.641527] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.641681] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.641842] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642014] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642176] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642343] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.send_service_user_token = True {{(pid=51600) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642505] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642659] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] service_user.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642831] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.agent_enabled = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.642994] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.643361] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.643613] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.html5proxy_host = 0.0.0.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.643788] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.html5proxy_port = 6082 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.643955] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.image_compression = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.644127] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.jpeg_compression = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.644284] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.playback_compression = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.644466] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.server_listen = 127.0.0.1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.644624] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.644806] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.streaming_mode = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.644981] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] spice.zlib_compression = None {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.645161] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] upgrade_levels.baseapi = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.645320] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] upgrade_levels.cert = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.645490] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] upgrade_levels.compute = auto {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.645649] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] upgrade_levels.conductor = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.645807] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] upgrade_levels.scheduler = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.645978] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.646164] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.auth_type = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.646361] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.646538] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.646707] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.646869] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.647034] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.647199] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.647358] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vendordata_dynamic_auth.timeout = None {{(pid=51600) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.647559] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.api_retry_count = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.647758] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.ca_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.647962] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.cache_prefix = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.648137] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.cluster_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.648297] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.connection_pool_size = 10 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.648450] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.console_delay_seconds = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.648604] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.datastore_regex = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.648761] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.host_ip = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.648915] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.host_password = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.649088] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.host_port = 443 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.649292] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.host_username = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.649511] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.649692] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.integration_bridge = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.649872] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.maximum_objects = 100 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.650044] nova-conductor[51600]: DEBUG 
oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.pbm_default_policy = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.650208] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.pbm_enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.650363] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.pbm_wsdl_location = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.650528] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.650683] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.serial_port_proxy_uri = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.650845] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.serial_port_service_uri = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.651051] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.task_poll_interval = 0.5 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.651220] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.use_linked_clone = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.651387] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.vnc_keymap = en-us {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.651548] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.vnc_port = 5900 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.651708] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vmware.vnc_port_total = 10000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.651923] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.auth_schemes = ['none'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.652120] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.enabled = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.652467] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.652660] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.652837] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.novncproxy_port = 6080 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.653029] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.server_listen = 127.0.0.1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.653208] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.653366] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.vencrypt_ca_certs = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.653524] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.vencrypt_client_cert = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.653678] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] vnc.vencrypt_client_key = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.653900] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.654084] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.disable_fallback_pcpu_query = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.654245] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.disable_group_policy_check_upcall = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.654402] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.654558] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.disable_rootwrap = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.654716] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.enable_numa_live_migration = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.654874] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.655040] 
nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.655226] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.handle_virt_lifecycle_events = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.655406] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.libvirt_disable_apic = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.655566] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.never_download_image_if_on_rbd = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.655723] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.655880] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.656047] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.656206] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.656362] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.656545] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.656733] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.656900] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.657097] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.657301] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.657484] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.client_socket_timeout = 900 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.657687] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.default_pool_size = 1000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.657872] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.keep_alive = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.658060] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.max_header_line = 16384 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.658225] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.secure_proxy_ssl_header = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.658382] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.ssl_ca_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.658539] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.ssl_cert_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.658693] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.ssl_key_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.658858] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.tcp_keepidle = 600 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.659039] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.659205] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] zvm.ca_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.659392] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] zvm.cloud_connector_url = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.659632] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.659832] nova-conductor[51600]: DEBUG oslo_service.service 
[None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] zvm.reachable_timeout = 300 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.660064] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.enforce_new_defaults = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.660255] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.enforce_scope = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.660448] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.policy_default_rule = default {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.660645] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.660862] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.policy_file = policy.yaml {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.661077] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.661258] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.661419] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.661576] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.661734] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.661930] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.662126] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.662378] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.connection_string = messaging:// {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.662563] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.enabled = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.662754] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.es_doc_type = notification {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.662936] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.es_scroll_size = 10000 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.663114] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.es_scroll_time = 2m {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.663277] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.filter_error_trace = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.663440] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.hmac_keys = SECRET_KEY {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.663603] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.sentinel_service_name = mymaster {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.663821] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.socket_timeout = 0.1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.663996] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] profiler.trace_sqlalchemy = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.664204] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] remote_debug.host = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.664385] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] remote_debug.port = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.664572] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.664738] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.664900] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=51600) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.665075] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.665297] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.665464] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.665630] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.665788] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.665952] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.666123] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.666301] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.666470] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.666668] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.666837] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.666999] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.667522] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
456.667522] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.667522] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.667710] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.667870] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.668071] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.668240] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.668401] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.668565] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.668725] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.668913] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.ssl = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.669078] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.669250] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.669470] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.669649] nova-conductor[51600]: DEBUG oslo_service.service [None 
req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.669848] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_rabbit.ssl_version = {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.670068] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.670240] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_notifications.retry = -1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.670438] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.670608] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_messaging_notifications.transport_url = **** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.670841] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.auth_section = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671013] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.auth_type = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671176] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.cafile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671330] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.certfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671492] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.collect_timing = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671645] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.connect_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671799] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.connect_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.671954] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.endpoint_id = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.672121] nova-conductor[51600]: DEBUG oslo_service.service 
[None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.endpoint_override = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.672276] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.insecure = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.672428] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.keyfile = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.672580] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.max_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.672735] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.min_version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.672916] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.region_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.673111] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.service_name = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.673267] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.service_type = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.673426] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.split_loggers = False {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.673579] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.status_code_retries = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.673731] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.status_code_retry_delay = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.673886] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.timeout = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.674048] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.valid_interfaces = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.674203] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_limit.version = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.674411] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] 
oslo_reports.file_event_handler = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.674569] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_reports.file_event_handler_interval = 1 {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.674723] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] oslo_reports.log_dir = None {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 456.674851] nova-conductor[51600]: DEBUG oslo_service.service [None req-91b45751-ee2b-4907-99ac-9913b4c0ff9c None None] ******************************************************************************** {{(pid=51600) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 541.106700] nova-conductor[52020]: DEBUG oslo_db.sqlalchemy.engines [None req-c657c30f-e644-49cb-a599-1297549cb48f None None] Parent process 51600 forked (52020) with an open database connection, which is being discarded and recreated. {{(pid=52020) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 582.356709] nova-conductor[52019]: DEBUG oslo_db.sqlalchemy.engines [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Parent process 51600 forked (52019) with an open database connection, which is being discarded and recreated. {{(pid=52019) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 583.190108] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Took 0.83 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 583.220206] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.220491] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.222100] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.230932] nova-conductor[52019]: DEBUG oslo_db.sqlalchemy.engines [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52019) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 583.313235] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.313448] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.314482] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.315163] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 
tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.315362] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.315771] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.325660] nova-conductor[52019]: DEBUG oslo_db.sqlalchemy.engines [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52019) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 583.343028] nova-conductor[52019]: DEBUG nova.quota [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Getting quotas for project 0c870ef0323546079b6471bb30e9eb36. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 583.346750] nova-conductor[52019]: DEBUG nova.quota [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Getting quotas for user cf6c7cdc98f9419e938b071cf3d6217a and project 0c870ef0323546079b6471bb30e9eb36. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 583.352316] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 583.353060] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.353276] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.353441] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.359237] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 583.359938] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.360151] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.360314] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.391892] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.391995] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.392165] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.392488] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52019) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 583.392649] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52019) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 583.393247] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.393436] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.393592] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils 
[None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.393975] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.394130] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.394284] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.400616] nova-conductor[52019]: INFO nova.compute.rpcapi [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 583.401111] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c42660fa-a6aa-4539-8e5c-bad25f7b5c02 None None] Releasing lock "compute-rpcapi-router" {{(pid=52019) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 584.090949] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Took 0.25 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 584.126888] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.130020] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.130020] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.169209] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.169428] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.169591] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.169976] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.170167] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 
tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.170319] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.189994] nova-conductor[52019]: DEBUG nova.quota [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Getting quotas for project 940667196dca494b839e5099008c23db. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 584.196015] nova-conductor[52019]: DEBUG nova.quota [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Getting quotas for user 286d530fe18948658ebd2710a36984d0 and project 940667196dca494b839e5099008c23db. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 584.202815] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 584.203753] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.204565] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.204783] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.210328] nova-conductor[52019]: DEBUG nova.conductor.manager [None 
req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 584.211086] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.211326] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.211518] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.252631] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.252861] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.253045] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 
0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.360044] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Took 0.23 seconds to select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 584.390022] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.390022] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.390190] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.430539] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.430539] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.430539] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.430856] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.430856] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.430973] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.449401] nova-conductor[52019]: DEBUG nova.quota [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Getting quotas for project 51248048a2ed4ee1801cec899ba5301b. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 584.452601] nova-conductor[52019]: DEBUG nova.quota [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Getting quotas for user 7a555b6832df4a0bb32b26622abc2f1a and project 51248048a2ed4ee1801cec899ba5301b. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 584.460847] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 584.462175] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.462666] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.462666] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.465630] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 584.466285] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.466488] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] 
Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.466651] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.494636] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.494899] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.495021] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.724331] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Took 0.19 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 586.748786] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.749087] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.750751] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.758780] nova-conductor[52020]: DEBUG oslo_db.sqlalchemy.engines [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52020) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 586.815772] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.815997] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.816471] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.816853] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 
tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.817045] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.817204] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.823119] nova-conductor[52020]: DEBUG oslo_db.sqlalchemy.engines [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52020) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 586.841294] nova-conductor[52020]: DEBUG nova.quota [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Getting quotas for project 9257ab9cebe8414fbc0c992b8c344182. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 586.844579] nova-conductor[52020]: DEBUG nova.quota [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Getting quotas for user c6d6b7e3e1ac4eb39d66cc03a1688972 and project 9257ab9cebe8414fbc0c992b8c344182. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 586.850411] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 586.851283] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.851492] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.851651] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.857894] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 586.858622] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.858819] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 
tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.858975] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.898586] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.898873] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.899057] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.899397] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52020) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 586.899558] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52020) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 586.900166] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.900347] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.900508] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.900893] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.901100] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.901284] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.913025] nova-conductor[52020]: INFO nova.compute.rpcapi [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 586.913025] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-acf91266-b8aa-4d56-9e41-df36df44c1ce None None] Releasing lock "compute-rpcapi-router" {{(pid=52020) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 587.254968] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 587.272131] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.272131] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.272131] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.329365] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.329365] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.330834] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.330834] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.330834] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.330834] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.372833] nova-conductor[52019]: DEBUG nova.quota [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Getting quotas for project de7e8d74ff79471c9b29bb62d6ca8f7b. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 587.374372] nova-conductor[52019]: DEBUG nova.quota [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Getting quotas for user 9f944574bee047198ea5a8f997006b73 and project de7e8d74ff79471c9b29bb62d6ca8f7b. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 587.380704] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 587.381376] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.382174] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.382174] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.385026] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 587.385535] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.385600] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.385760] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.407173] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.407889] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.407889] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.467483] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Took 0.16 
seconds to select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 587.480844] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.480992] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.481178] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.540501] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.540720] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.540916] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.541292] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.541478] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 
tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.541637] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.551213] nova-conductor[52019]: DEBUG nova.quota [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Getting quotas for project d239a4f0ed5b48cf9cd9a334de6f189c. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 587.553105] nova-conductor[52019]: DEBUG nova.quota [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Getting quotas for user 2e0c20ce66e045a5bfdffc27e037327e and project d239a4f0ed5b48cf9cd9a334de6f189c. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 587.558101] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 587.558567] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.559378] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.559378] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.562014] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 
tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 587.562653] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.562842] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.563423] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.580128] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.580128] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.580263] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.069021] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Took 0.23 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 588.088113] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.088113] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.088113] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.121727] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.123173] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.123173] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.123173] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.123173] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.123355] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.131875] nova-conductor[52020]: DEBUG nova.quota [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Getting quotas for project 6cc419217094416381972f1ec63d776f. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 588.134487] nova-conductor[52020]: DEBUG nova.quota [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Getting quotas for user ca0ef629a1964c7692635b0864879b8e and project 6cc419217094416381972f1ec63d776f. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 588.140457] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 588.140906] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.141116] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.141278] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.144662] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Took 0.21 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 588.147447] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 588.148139] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.148353] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.148520] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.163963] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.164209] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.164380] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.185905] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.186153] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.186333] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.226119] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.226179] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.226343] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.226667] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.226886] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.227012] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.245195] nova-conductor[52020]: DEBUG nova.quota [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Getting quotas for project b1e8436dbc5d4f26b38c83626def8b09. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 588.250698] nova-conductor[52020]: DEBUG nova.quota [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Getting quotas for user e842af4991e744e48fc9432a7e6429ee and project b1e8436dbc5d4f26b38c83626def8b09. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 588.277439] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 588.277439] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.277439] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.277439] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.292692] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 588.293072] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.293180] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.293530] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.344831] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.345075] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.345252] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 
tempest-ServerExternalEventsTest-827682188-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.115075] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 591.129129] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.129483] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.129569] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.172258] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.172561] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.172561] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.172822] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.172953] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.173128] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.181362] nova-conductor[52019]: DEBUG nova.quota [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Getting quotas for project a62f6b4b95a847bc914323ae8eca38fc. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 591.185906] nova-conductor[52019]: DEBUG nova.quota [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Getting quotas for user 49028902d8e54906824c4a42be504e3d and project a62f6b4b95a847bc914323ae8eca38fc. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 591.189275] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 591.189784] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.189988] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.190174] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.196988] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 591.197710] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.197945] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None 
req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.198134] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.211741] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.211964] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.212190] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.163264] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 602.186099] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.186952] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.186952] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.226259] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.226487] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.226658] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.227256] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.227256] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.227428] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.238499] nova-conductor[52019]: DEBUG nova.quota [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Getting quotas for project b54ecd325ea04fb58510dbc4b236d0e3. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 602.241693] nova-conductor[52019]: DEBUG nova.quota [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Getting quotas for user cf57234027f34707a730c895bcac8ccd and project b54ecd325ea04fb58510dbc4b236d0e3. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 602.249321] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 602.249321] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.249321] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.249321] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.252596] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] 
block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 602.253445] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.253762] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.254046] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.267040] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.267135] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.267369] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.793123] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 
tempest-ServerShowV254Test-1763719203-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 602.804948] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.805196] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.805363] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.881128] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.881128] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.881128] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.881128] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.881345] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock 
"11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.881345] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.891134] nova-conductor[52020]: DEBUG nova.quota [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Getting quotas for project 5ea8a2bfcb214fecb3a7afea860a90da. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 602.893657] nova-conductor[52020]: DEBUG nova.quota [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Getting quotas for user f76000f269bc439a8a2f8293449d33f4 and project 5ea8a2bfcb214fecb3a7afea860a90da. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 602.900502] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 602.900502] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.900502] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.900667] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.907411] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 602.908120] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.908369] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.908549] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.921865] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.922148] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.922322] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.919978] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Took 0.16 seconds to select destinations for 1 
instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 603.937896] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.938906] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.938906] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.975016] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.976343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.976343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.976343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.976343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.976518] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.984723] nova-conductor[52019]: DEBUG nova.quota [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Getting quotas for project 6877b048a8b4486bbbc359726a58f5e6. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 603.987031] nova-conductor[52019]: DEBUG nova.quota [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Getting quotas for user 5cffd1ab1a0f45499abb7a5818170152 and project 6877b048a8b4486bbbc359726a58f5e6. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 603.997924] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 603.998487] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.998715] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.999810] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.001589] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 604.002287] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.002582] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.002667] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.017254] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.017950] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.017950] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Failed to schedule 
instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 607.203695] nova-conductor[52019]: Traceback (most recent call last): [ 607.203695] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 607.203695] nova-conductor[52019]: return func(*args, **kwargs) [ 607.203695] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 607.203695] nova-conductor[52019]: selections = self._select_destinations( [ 607.203695] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 607.203695] nova-conductor[52019]: selections = self._schedule( [ 607.203695] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 607.203695] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 607.203695] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 607.203695] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 607.203695] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 607.203695] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 607.204530] nova-conductor[52019]: ERROR 
nova.conductor.manager raise result [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 607.204530] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
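The traceback above bottoms out in _ensure_sufficient_hosts (/opt/stack/nova/nova/scheduler/manager.py:499) raising NoValidHost because no compute host survived filtering for the request. Below is a minimal sketch of that kind of guard; the exception class and helper are simplified stand-ins built from the logged message, not Nova's actual implementation.

# Minimal sketch of the guard the traceback above ends in
# (_ensure_sufficient_hosts, nova/scheduler/manager.py:499). Simplified
# stand-ins only, taken from the logged message, not Nova's code.

class NoValidHost(Exception):
    def __init__(self, reason):
        super().__init__("No valid host was found. " + reason)

def ensure_sufficient_hosts(hosts, required_count=1):
    """Raise when fewer hosts survived scheduler filtering than requested."""
    if len(hosts) < required_count:
        raise NoValidHost(reason="There are not enough hosts available.")

try:
    # An empty host list (every compute node filtered out or unavailable)
    # reproduces the message seen in the conductor log.
    ensure_sufficient_hosts(hosts=[], required_count=1)
except NoValidHost as exc:
    print(exc)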
[ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.205150] nova-conductor[52019]: ERROR nova.conductor.manager [ 607.215885] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.217814] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.217814] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.278735] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] [instance: 3be005c8-8cbe-4c3a-9478-840167b99e97] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 607.278735] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.278735] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.279675] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 
tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.284464] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 607.284464] nova-conductor[52019]: Traceback (most recent call last): [ 607.284464] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 607.284464] nova-conductor[52019]: return func(*args, **kwargs) [ 607.284464] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 607.284464] nova-conductor[52019]: selections = self._select_destinations( [ 607.284464] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 607.284464] nova-conductor[52019]: selections = self._schedule( [ 607.284464] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 607.284464] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 607.284464] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 607.284464] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 607.284464] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 607.284464] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 607.285831] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-078d50d3-e5f3-4888-bc0a-d9efbfb5db22 tempest-VolumesAssistedSnapshotsTest-1277997804 tempest-VolumesAssistedSnapshotsTest-1277997804-project-member] [instance: 3be005c8-8cbe-4c3a-9478-840167b99e97] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 608.681753] nova-conductor[52020]: Traceback (most recent call last): [ 608.681753] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 608.681753] nova-conductor[52020]: return func(*args, **kwargs) [ 608.681753] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 608.681753] nova-conductor[52020]: selections = self._select_destinations( [ 608.681753] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 608.681753] nova-conductor[52020]: selections = self._schedule( [ 608.681753] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 608.681753] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 608.681753] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 608.681753] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 608.681753] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 608.681753] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 608.682550] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
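The same failure surfaces on the conductor side as nova.exception_Remote.NoValidHost_Remote after travelling back through the RPC layer (the "raise result" frame in amqpdriver above). The sketch below only illustrates the general pattern of rebuilding a remote failure as a dynamically created "_Remote" subclass so handlers for the original type still catch it; it is not oslo.messaging's actual deserialization code.

# Illustration only (not oslo.messaging's real code) of why the conductor
# log shows "NoValidHost_Remote" yet the failure is still handled as a
# NoValidHost: the remote exception is rebuilt as a subclass of the
# original exception class.

class NoValidHost(Exception):
    pass

def rebuild_remote_exception(exc_cls, message):
    # Create "<Name>_Remote", subclassing the original exception type.
    remote_cls = type(exc_cls.__name__ + "_Remote", (exc_cls,), {})
    return remote_cls(message)

err = rebuild_remote_exception(
    NoValidHost,
    "No valid host was found. There are not enough hosts available.")

try:
    raise err
except NoValidHost as exc:
    # The _Remote subclass is caught by handlers for the original type,
    # which is how the conductor treats it as a scheduling failure.
    print(type(exc).__name__, "->", exc)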
[ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.683103] nova-conductor[52020]: ERROR nova.conductor.manager [ 608.694208] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.694208] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.694208] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.748921] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] [instance: 6674867a-80c3-4e83-9b1e-134dc6339f3c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 608.749666] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.749956] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.750151] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.755012] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 608.755012] nova-conductor[52020]: Traceback (most recent call last): [ 608.755012] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 608.755012] nova-conductor[52020]: return func(*args, **kwargs) [ 608.755012] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 608.755012] nova-conductor[52020]: selections = self._select_destinations( [ 608.755012] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 608.755012] nova-conductor[52020]: selections = self._schedule( [ 608.755012] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 608.755012] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 608.755012] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 608.755012] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 608.755012] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 608.755012] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 608.756145] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-f5fbf8fe-f75c-454a-bd25-2b6c63d2a371 tempest-ImagesNegativeTestJSON-423402721 tempest-ImagesNegativeTestJSON-423402721-project-member] [instance: 6674867a-80c3-4e83-9b1e-134dc6339f3c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 613.757067] nova-conductor[52019]: Traceback (most recent call last): [ 613.757067] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 613.757067] nova-conductor[52019]: return func(*args, **kwargs) [ 613.757067] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 613.757067] nova-conductor[52019]: selections = self._select_destinations( [ 613.757067] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 613.757067] nova-conductor[52019]: selections = self._schedule( [ 613.757067] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 613.757067] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 613.757067] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 613.757067] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 613.757067] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 613.757067] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 613.758617] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
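The recurring 'Acquiring lock "<cell uuid>" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections"' lines, each held for roughly 0.000s, come from a per-cell lock around a small connection cache. A rough sketch of that pattern is below; the lockutils.synchronized decorator is the oslo_concurrency API shown in the log, while the cache and function names are illustrative rather than Nova's exact code.

# Rough sketch of a per-cell lock guarding a connection cache, which is
# why each logged hold is ~0.000s. Only lockutils.synchronized is the
# library API from the log; everything else is illustrative.

from oslo_concurrency import lockutils

_CELL_CACHE = {}

def set_target_cell(cell_uuid, connection_factory):
    @lockutils.synchronized(cell_uuid)
    def _get_or_set_cached_cell_connection():
        # Build the cell's DB connection once; later callers reuse it.
        if cell_uuid not in _CELL_CACHE:
            _CELL_CACHE[cell_uuid] = connection_factory(cell_uuid)
        return _CELL_CACHE[cell_uuid]

    return _get_or_set_cached_cell_connection()

# Example: both calls share one cached "connection" object for cell1.
conn_a = set_target_cell("11743078-64bf-4468-a786-557c60808969", lambda uuid: object())
conn_b = set_target_cell("11743078-64bf-4468-a786-557c60808969", lambda uuid: object())
assert conn_a is conn_b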
[ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.759246] nova-conductor[52019]: ERROR nova.conductor.manager [ 613.766653] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.766901] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.767092] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.839278] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] [instance: e8a4d12f-e119-4b71-ac84-e7f8c09aa75f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 613.840140] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.840381] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.842284] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.845041] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 613.845041] nova-conductor[52019]: Traceback (most recent call last): [ 613.845041] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 613.845041] nova-conductor[52019]: return func(*args, **kwargs) [ 613.845041] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 613.845041] nova-conductor[52019]: selections = self._select_destinations( [ 613.845041] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 613.845041] nova-conductor[52019]: selections = self._schedule( [ 613.845041] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 613.845041] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 613.845041] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 613.845041] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 613.845041] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 613.845041] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 613.845559] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-1b47de7d-1f62-4d90-bc2a-a2badc56b801 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] [instance: e8a4d12f-e119-4b71-ac84-e7f8c09aa75f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 614.420656] nova-conductor[52020]: Traceback (most recent call last): [ 614.420656] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 614.420656] nova-conductor[52020]: return func(*args, **kwargs) [ 614.420656] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 614.420656] nova-conductor[52020]: selections = self._select_destinations( [ 614.420656] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 614.420656] nova-conductor[52020]: selections = self._schedule( [ 614.420656] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 614.420656] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 614.420656] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 614.420656] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 614.420656] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 614.420656] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 614.421480] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
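The BlockDeviceMapping entries logged above by _create_block_device_mapping describe an image-backed local root disk (source_type='image', destination_type='local', boot_index=0, delete_on_termination=True). The dict below is roughly the request-side block_device_mapping_v2 equivalent; the field values are taken from the log, while the surrounding server request structure (name, flavorRef) is illustrative.

# Request-side sketch of the image-backed local root disk seen in the
# logged BlockDeviceMapping; values from the log, wrapper is illustrative.

root_disk_bdm = {
    "uuid": "a816e082-61f0-4ffa-a214-1bf6bd197f53",  # image_id in the log
    "source_type": "image",
    "destination_type": "local",
    "boot_index": 0,
    "device_type": "disk",
    "delete_on_termination": True,
}

server_request = {
    "server": {
        "name": "example-server",                   # illustrative
        "flavorRef": "1",                           # illustrative
        "block_device_mapping_v2": [root_disk_bdm],
    }
}

print(server_request)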
[ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.422354] nova-conductor[52020]: ERROR nova.conductor.manager [ 614.432824] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.433760] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.433760] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.509779] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] [instance: b9cda1b4-4e6e-488e-a43f-9f37ae2feee5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 614.509779] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.509779] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.510399] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.513770] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 614.513770] nova-conductor[52020]: Traceback (most recent call last): [ 614.513770] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 614.513770] nova-conductor[52020]: return func(*args, **kwargs) [ 614.513770] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 614.513770] nova-conductor[52020]: selections = self._select_destinations( [ 614.513770] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 614.513770] nova-conductor[52020]: selections = self._schedule( [ 614.513770] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 614.513770] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 614.513770] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 614.513770] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 614.513770] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 614.513770] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 614.514854] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7b4071a1-2eaa-43ff-81b7-b9a91078cf46 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] [instance: b9cda1b4-4e6e-488e-a43f-9f37ae2feee5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 615.458533] nova-conductor[52019]: Traceback (most recent call last): [ 615.458533] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 615.458533] nova-conductor[52019]: return func(*args, **kwargs) [ 615.458533] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 615.458533] nova-conductor[52019]: selections = self._select_destinations( [ 615.458533] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 615.458533] nova-conductor[52019]: selections = self._schedule( [ 615.458533] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 615.458533] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 615.458533] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 615.458533] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 615.458533] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 615.458533] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 615.459293] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.460050] nova-conductor[52019]: ERROR nova.conductor.manager [ 615.465813] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.467031] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.467031] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.527678] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] [instance: d5f83b3a-36b2-4c83-9858-0a74e48e00a3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 615.528504] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.529734] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.529734] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 
tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.533062] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 615.533062] nova-conductor[52019]: Traceback (most recent call last): [ 615.533062] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 615.533062] nova-conductor[52019]: return func(*args, **kwargs) [ 615.533062] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 615.533062] nova-conductor[52019]: selections = self._select_destinations( [ 615.533062] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 615.533062] nova-conductor[52019]: selections = self._schedule( [ 615.533062] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 615.533062] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 615.533062] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 615.533062] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 615.533062] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 615.533062] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 615.534278] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-835b0354-fd3d-4cf5-ac66-a9ba15ebe788 tempest-ServersWithSpecificFlavorTestJSON-1898801755 tempest-ServersWithSpecificFlavorTestJSON-1898801755-project-member] [instance: d5f83b3a-36b2-4c83-9858-0a74e48e00a3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.000130] nova-conductor[52020]: Traceback (most recent call last): [ 617.000130] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.000130] nova-conductor[52020]: return func(*args, **kwargs) [ 617.000130] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.000130] nova-conductor[52020]: selections = self._select_destinations( [ 617.000130] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.000130] nova-conductor[52020]: selections = self._schedule( [ 617.000130] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.000130] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 617.000130] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.000130] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 617.000130] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 617.000130] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 617.000883] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.001417] nova-conductor[52020]: ERROR nova.conductor.manager [ 617.029720] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.029720] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.029720] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.066138] nova-conductor[52019]: Traceback (most recent call last): [ 617.066138] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.066138] nova-conductor[52019]: return func(*args, **kwargs) [ 617.066138] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.066138] nova-conductor[52019]: selections = self._select_destinations( [ 617.066138] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.066138] nova-conductor[52019]: selections = self._schedule( [ 617.066138] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.066138] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 617.066138] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.066138] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 617.066138] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 617.066138] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 617.066909] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.067481] nova-conductor[52019]: ERROR nova.conductor.manager [ 617.080350] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.080350] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.003s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.080350] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.111087] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] [instance: 5b188e02-5748-47ae-8e07-947bc1b751b6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.113289] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.113289] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.113289] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.120954] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 617.120954] nova-conductor[52020]: Traceback (most recent call last): [ 617.120954] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.120954] nova-conductor[52020]: return func(*args, **kwargs) [ 617.120954] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.120954] nova-conductor[52020]: selections = self._select_destinations( [ 617.120954] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.120954] nova-conductor[52020]: selections = self._schedule( [ 617.120954] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.120954] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 617.120954] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.120954] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 617.120954] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.120954] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.122707] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-c16939da-41f3-4d0c-af99-036342e57d61 tempest-ServersAdminTestJSON-401499959 tempest-ServersAdminTestJSON-401499959-project-member] [instance: 5b188e02-5748-47ae-8e07-947bc1b751b6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.160824] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] [instance: 784b2226-7bd1-464e-b067-3e1a8ede0380] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.162405] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.162782] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.163388] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.172125] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 617.172125] nova-conductor[52019]: Traceback (most recent call last): [ 617.172125] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.172125] nova-conductor[52019]: return func(*args, **kwargs) [ 617.172125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.172125] nova-conductor[52019]: selections = self._select_destinations( [ 617.172125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.172125] nova-conductor[52019]: selections = self._schedule( [ 617.172125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.172125] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 617.172125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.172125] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 617.172125] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.172125] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.172125] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-7edc95a2-4abe-49a8-82d4-276b8bcad543 tempest-InstanceActionsV221TestJSON-1281874576 tempest-InstanceActionsV221TestJSON-1281874576-project-member] [instance: 784b2226-7bd1-464e-b067-3e1a8ede0380] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 620.693810] nova-conductor[52020]: Traceback (most recent call last): [ 620.693810] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 620.693810] nova-conductor[52020]: return func(*args, **kwargs) [ 620.693810] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 620.693810] nova-conductor[52020]: selections = self._select_destinations( [ 620.693810] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 620.693810] nova-conductor[52020]: selections = self._schedule( [ 620.693810] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 620.693810] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 620.693810] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 620.693810] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 620.693810] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 620.693810] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 620.694690] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.695316] nova-conductor[52020]: ERROR nova.conductor.manager [ 620.702296] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.702534] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.702706] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.752295] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] [instance: 4fc66c3a-ce44-40b1-a7c5-a55ac6ae2e2e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 620.752295] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.752295] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.752433] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.755687] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 620.755687] nova-conductor[52020]: Traceback (most recent call last): [ 620.755687] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 620.755687] nova-conductor[52020]: return func(*args, **kwargs) [ 620.755687] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 620.755687] nova-conductor[52020]: selections = self._select_destinations( [ 620.755687] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 620.755687] nova-conductor[52020]: selections = self._schedule( [ 620.755687] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 620.755687] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 620.755687] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 620.755687] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 620.755687] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 620.755687] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 620.755687] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-b0eec7d6-ef58-47fc-8e61-ba5f7e11fb19 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] [instance: 4fc66c3a-ce44-40b1-a7c5-a55ac6ae2e2e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 621.122641] nova-conductor[52019]: Traceback (most recent call last): [ 621.122641] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 621.122641] nova-conductor[52019]: return func(*args, **kwargs) [ 621.122641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 621.122641] nova-conductor[52019]: selections = self._select_destinations( [ 621.122641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 621.122641] nova-conductor[52019]: selections = self._schedule( [ 621.122641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 621.122641] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 621.122641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 621.122641] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 621.122641] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 621.122641] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 621.123528] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.124200] nova-conductor[52019]: ERROR nova.conductor.manager [ 621.127883] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.128201] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.128374] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.175030] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] [instance: 21bae61a-3f84-4fc2-9775-d29843e16cda] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 621.175030] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.175030] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.175257] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.177399] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 621.177399] nova-conductor[52019]: Traceback (most recent call last): [ 621.177399] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 621.177399] nova-conductor[52019]: return func(*args, **kwargs) [ 621.177399] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 621.177399] nova-conductor[52019]: selections = self._select_destinations( [ 621.177399] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 621.177399] nova-conductor[52019]: selections = self._schedule( [ 621.177399] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 621.177399] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 621.177399] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 621.177399] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 621.177399] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 621.177399] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 621.177913] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9d3d64e5-f623-45f7-94a2-0b7f4c481555 tempest-ServerPasswordTestJSON-218453344 tempest-ServerPasswordTestJSON-218453344-project-member] [instance: 21bae61a-3f84-4fc2-9775-d29843e16cda] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 622.304326] nova-conductor[52020]: Traceback (most recent call last): [ 622.304326] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 622.304326] nova-conductor[52020]: return func(*args, **kwargs) [ 622.304326] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 622.304326] nova-conductor[52020]: selections = self._select_destinations( [ 622.304326] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 622.304326] nova-conductor[52020]: selections = self._schedule( [ 622.304326] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 622.304326] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 622.304326] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 622.304326] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 622.304326] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 622.304326] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 622.305082] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
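Annotation: the conductor logs the failure as nova.exception_Remote.NoValidHost_Remote rather than nova.exception.NoValidHost. When the scheduler's exception comes back over RPC, oslo.messaging rebuilds it on the calling side under a class and module name suffixed with _Remote, which is why both spellings appear in the same block. The sketch below only illustrates that renaming convention; it is not oslo.messaging's deserialization code.

# Illustration of the "_Remote" naming visible above: a failure raised in
# the scheduler process is rebuilt in the conductor process under suffixed
# class/module names so remote re-raises are distinguishable from local
# ones. Simplified sketch only.

def make_remote_type(exc_type):
    """Return a subclass whose class and module names carry _Remote."""
    remote = type(exc_type.__name__ + "_Remote", (exc_type,), {})
    remote.__module__ = exc_type.__module__ + "_Remote"
    return remote


class NoValidHost(Exception):
    pass


NoValidHost.__module__ = "nova.exception"  # pretend origin for the demo

remote_cls = make_remote_type(NoValidHost)
err = remote_cls("No valid host was found. There are not enough hosts available.")
print("%s.%s: %s" % (err.__class__.__module__, err.__class__.__name__, err))
# nova.exception_Remote.NoValidHost_Remote: No valid host was found. ...
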
[ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.305621] nova-conductor[52020]: ERROR nova.conductor.manager [ 622.314716] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.314716] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.314716] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.366107] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: e83917eb-196f-49bb-9f76-762efc6bfa4c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 622.366460] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.366558] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.367066] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.369868] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 622.369868] nova-conductor[52020]: Traceback (most recent call last): [ 622.369868] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 622.369868] nova-conductor[52020]: return func(*args, **kwargs) [ 622.369868] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 622.369868] nova-conductor[52020]: selections = self._select_destinations( [ 622.369868] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 622.369868] nova-conductor[52020]: selections = self._schedule( [ 622.369868] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 622.369868] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 622.369868] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 622.369868] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 622.369868] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 622.369868] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 622.370417] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7d6da4f6-bec8-4766-a3a4-63365cd64a35 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: e83917eb-196f-49bb-9f76-762efc6bfa4c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.156147] nova-conductor[52019]: Traceback (most recent call last): [ 623.156147] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.156147] nova-conductor[52019]: return func(*args, **kwargs) [ 623.156147] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.156147] nova-conductor[52019]: selections = self._select_destinations( [ 623.156147] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.156147] nova-conductor[52019]: selections = self._schedule( [ 623.156147] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.156147] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 623.156147] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.156147] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 623.156147] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 623.156147] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 623.156918] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
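Annotation: every traceback ends with the same two-part message. In nova this comes from an exception class whose message template is roughly "No valid host was found. %(reason)s", with the scheduler supplying the reason string. The sketch below shows that pattern in simplified form; the real NovaException base also handles translation, fault codes, and malformed format arguments.

# Sketch of the message pattern behind the repeated error text; only the
# format-string behaviour relevant to this log is kept.

class NovaExceptionSketch(Exception):
    msg_fmt = "An unknown exception occurred."

    def __init__(self, **kwargs):
        super().__init__(self.msg_fmt % kwargs)


class NoValidHost(NovaExceptionSketch):
    msg_fmt = "No valid host was found. %(reason)s"


print(NoValidHost(reason="There are not enough hosts available."))
# No valid host was found. There are not enough hosts available.
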
[ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.157507] nova-conductor[52019]: ERROR nova.conductor.manager [ 623.163269] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.163548] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.163943] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.205606] nova-conductor[52020]: Traceback (most recent call last): [ 623.205606] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.205606] nova-conductor[52020]: return func(*args, **kwargs) [ 623.205606] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.205606] nova-conductor[52020]: selections = self._select_destinations( [ 623.205606] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.205606] nova-conductor[52020]: selections = self._schedule( [ 623.205606] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.205606] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 623.205606] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.205606] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 623.205606] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 623.205606] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 623.206188] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.206760] nova-conductor[52020]: ERROR nova.conductor.manager [ 623.216015] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.216015] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.216225] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.227018] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] [instance: 19283ac4-6955-40ee-9ce7-edec4bb5c2a7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 623.227018] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.227018] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.227259] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.230953] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 623.230953] nova-conductor[52019]: Traceback (most recent call last): [ 623.230953] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.230953] nova-conductor[52019]: return func(*args, **kwargs) [ 623.230953] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.230953] nova-conductor[52019]: selections = self._select_destinations( [ 623.230953] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.230953] nova-conductor[52019]: selections = self._schedule( [ 623.230953] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.230953] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 623.230953] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.230953] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 623.230953] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.230953] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.232016] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-ceccb36b-f762-44a1-b13d-cd13decfbe18 tempest-ListImageFiltersTestJSON-2121114852 tempest-ListImageFiltersTestJSON-2121114852-project-member] [instance: 19283ac4-6955-40ee-9ce7-edec4bb5c2a7] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.281020] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] [instance: 06762eff-38d6-462e-9794-d842af3d7f15] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 623.281020] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.281020] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.281507] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.284558] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 623.284558] nova-conductor[52020]: Traceback (most recent call last): [ 623.284558] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.284558] nova-conductor[52020]: return func(*args, **kwargs) [ 623.284558] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.284558] nova-conductor[52020]: selections = self._select_destinations( [ 623.284558] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.284558] nova-conductor[52020]: selections = self._schedule( [ 623.284558] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.284558] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 623.284558] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.284558] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 623.284558] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.284558] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.285493] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-f3947a42-070f-4fc6-985c-342fe6425bea tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] [instance: 06762eff-38d6-462e-9794-d842af3d7f15] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 625.471148] nova-conductor[52019]: Traceback (most recent call last): [ 625.471148] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 625.471148] nova-conductor[52019]: return func(*args, **kwargs) [ 625.471148] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 625.471148] nova-conductor[52019]: selections = self._select_destinations( [ 625.471148] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 625.471148] nova-conductor[52019]: selections = self._schedule( [ 625.471148] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 625.471148] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 625.471148] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 625.471148] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 625.471148] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 625.471148] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 625.472071] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.472688] nova-conductor[52019]: ERROR nova.conductor.manager [ 625.477962] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.478218] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.478379] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.527243] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] [instance: 4fa6e54a-7bf9-4d2f-8255-a4829d11afdd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 625.528713] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.528713] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.528713] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.533877] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 625.533877] nova-conductor[52019]: Traceback (most recent call last): [ 625.533877] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 625.533877] nova-conductor[52019]: return func(*args, **kwargs) [ 625.533877] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 625.533877] nova-conductor[52019]: selections = self._select_destinations( [ 625.533877] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 625.533877] nova-conductor[52019]: selections = self._schedule( [ 625.533877] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 625.533877] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 625.533877] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 625.533877] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 625.533877] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 625.533877] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 625.537975] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-0dea5a8e-d6a1-4821-ac08-464f6436a9d8 tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] [instance: 4fa6e54a-7bf9-4d2f-8255-a4829d11afdd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.809204] nova-conductor[52020]: Traceback (most recent call last): [ 627.809204] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.809204] nova-conductor[52020]: return func(*args, **kwargs) [ 627.809204] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.809204] nova-conductor[52020]: selections = self._select_destinations( [ 627.809204] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.809204] nova-conductor[52020]: selections = self._schedule( [ 627.809204] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.809204] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 627.809204] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.809204] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 627.809204] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 627.809204] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 627.810042] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.810697] nova-conductor[52020]: ERROR nova.conductor.manager [ 627.822651] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.822651] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.822651] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.840449] nova-conductor[52019]: Traceback (most recent call last): [ 627.840449] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.840449] nova-conductor[52019]: return func(*args, **kwargs) [ 627.840449] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.840449] nova-conductor[52019]: selections = self._select_destinations( [ 627.840449] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.840449] nova-conductor[52019]: selections = self._schedule( [ 627.840449] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.840449] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 627.840449] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.840449] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 627.840449] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 627.840449] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 627.841277] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
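Editor's note: the same failure is logged under two names: nova.exception.NoValidHost (the scheduler-side traceback carried back in the RPC reply) and nova.exception_Remote.NoValidHost_Remote (the class the conductor raises locally; the module name gets the same "_Remote" tag, hence "nova.exception_Remote" above). The sketch below is a simplified, assumed illustration of that remote-exception rebuilding, not oslo.messaging's actual code; the point is that the rebuilt class is still a subclass of the original, so an ordinary "except NoValidHost" handler catches it.

# Simplified illustration of how a remote exception can be rebuilt locally
# (assumed/condensed; not the oslo_messaging implementation).
class NoValidHost(Exception):
    pass

def make_remote(exc_type, message, server_traceback):
    """Rebuild a server-side exception as a local '<Name>_Remote' subclass."""
    remote_cls = type(exc_type.__name__ + "_Remote", (exc_type,), {})
    exc = remote_cls(message)
    exc.server_traceback = server_traceback   # keep the remote traceback text
    return exc

remote_exc = make_remote(
    NoValidHost,
    "No valid host was found. There are not enough hosts available.",
    "Traceback (most recent call last): ...")

try:
    raise remote_exc
except NoValidHost as exc:          # the _Remote subclass is still a NoValidHost
    print(type(exc).__name__)       # NoValidHost_Remote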
[ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.841932] nova-conductor[52019]: ERROR nova.conductor.manager [ 627.846921] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.847224] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.847831] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.879059] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] [instance: 0d3a33ba-9948-46b0-a780-05187b3b9e2a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.879059] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.879059] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.879283] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.880709] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 627.880709] nova-conductor[52020]: Traceback (most recent call last): [ 627.880709] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.880709] nova-conductor[52020]: return func(*args, **kwargs) [ 627.880709] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.880709] nova-conductor[52020]: selections = self._select_destinations( [ 627.880709] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.880709] nova-conductor[52020]: selections = self._schedule( [ 627.880709] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.880709] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 627.880709] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.880709] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 627.880709] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.880709] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.881813] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-99e8ec92-c08d-429a-b60e-0b7011d6f382 tempest-InstanceActionsTestJSON-128812469 tempest-InstanceActionsTestJSON-128812469-project-member] [instance: 0d3a33ba-9948-46b0-a780-05187b3b9e2a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.889140] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] [instance: 53171b82-87bd-4747-8c44-f4b96944576b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.889872] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.890096] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.890261] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.895590] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 627.895590] nova-conductor[52019]: Traceback (most recent call last): [ 627.895590] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.895590] nova-conductor[52019]: return func(*args, **kwargs) [ 627.895590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.895590] nova-conductor[52019]: selections = self._select_destinations( [ 627.895590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.895590] nova-conductor[52019]: selections = self._schedule( [ 627.895590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.895590] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 627.895590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.895590] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 627.895590] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.895590] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.896102] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-8b9af34f-00a8-482b-8076-e963649838db tempest-ListServerFiltersTestJSON-319734708 tempest-ListServerFiltersTestJSON-319734708-project-member] [instance: 53171b82-87bd-4747-8c44-f4b96944576b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 632.499017] nova-conductor[52020]: Traceback (most recent call last): [ 632.499017] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 632.499017] nova-conductor[52020]: return func(*args, **kwargs) [ 632.499017] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 632.499017] nova-conductor[52020]: selections = self._select_destinations( [ 632.499017] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 632.499017] nova-conductor[52020]: selections = self._schedule( [ 632.499017] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 632.499017] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 632.499017] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 632.499017] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 632.499017] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 632.499017] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 632.500315] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.501242] nova-conductor[52020]: ERROR nova.conductor.manager [ 632.513321] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.513321] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.513466] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.631232] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] [instance: 37ead82c-5149-4ca4-b6d1-1c6a9b7a1d8f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 632.632049] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.632220] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.632431] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.637539] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 632.637539] nova-conductor[52020]: Traceback (most recent call last): [ 632.637539] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 632.637539] nova-conductor[52020]: return func(*args, **kwargs) [ 632.637539] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 632.637539] nova-conductor[52020]: selections = self._select_destinations( [ 632.637539] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 632.637539] nova-conductor[52020]: selections = self._schedule( [ 632.637539] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 632.637539] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 632.637539] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 632.637539] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 632.637539] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 632.637539] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 632.638280] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5ecea07c-69c2-47e9-b1f7-b9b17254da86 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] [instance: 37ead82c-5149-4ca4-b6d1-1c6a9b7a1d8f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 633.471682] nova-conductor[52019]: Traceback (most recent call last): [ 633.471682] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 633.471682] nova-conductor[52019]: return func(*args, **kwargs) [ 633.471682] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 633.471682] nova-conductor[52019]: selections = self._select_destinations( [ 633.471682] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 633.471682] nova-conductor[52019]: selections = self._schedule( [ 633.471682] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 633.471682] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 633.471682] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 633.471682] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 633.471682] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 633.471682] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 633.472849] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.473881] nova-conductor[52019]: ERROR nova.conductor.manager [ 633.486312] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.486563] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.486735] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.541471] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 37202d88-1a7e-4269-8b42-6f37a277fde0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 633.542467] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.542697] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.543043] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.547125] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 633.547125] nova-conductor[52019]: Traceback (most recent call last): [ 633.547125] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 633.547125] nova-conductor[52019]: return func(*args, **kwargs) [ 633.547125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 633.547125] nova-conductor[52019]: selections = self._select_destinations( [ 633.547125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 633.547125] nova-conductor[52019]: selections = self._schedule( [ 633.547125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 633.547125] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 633.547125] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 633.547125] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 633.547125] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 633.547125] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 633.547515] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-8cbcf482-fe29-4c89-8d60-53e3f84ad15b tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 37202d88-1a7e-4269-8b42-6f37a277fde0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 639.538029] nova-conductor[52020]: Traceback (most recent call last): [ 639.538029] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 639.538029] nova-conductor[52020]: return func(*args, **kwargs) [ 639.538029] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 639.538029] nova-conductor[52020]: selections = self._select_destinations( [ 639.538029] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 639.538029] nova-conductor[52020]: selections = self._schedule( [ 639.538029] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 639.538029] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 639.538029] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 639.538029] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 639.538029] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 639.538029] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 639.540515] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.542095] nova-conductor[52020]: ERROR nova.conductor.manager [ 639.544950] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.545189] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.545359] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.591280] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] [instance: e65824b0-5754-4a40-bee8-cd91c606c230] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 639.591971] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.592246] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.592420] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 
tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.597716] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 639.597716] nova-conductor[52020]: Traceback (most recent call last): [ 639.597716] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 639.597716] nova-conductor[52020]: return func(*args, **kwargs) [ 639.597716] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 639.597716] nova-conductor[52020]: selections = self._select_destinations( [ 639.597716] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 639.597716] nova-conductor[52020]: selections = self._schedule( [ 639.597716] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 639.597716] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 639.597716] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 639.597716] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 639.597716] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 639.597716] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 639.597716] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-4b1cc2bd-38ef-4d29-878c-e55666319fea tempest-ServerRescueTestJSONUnderV235-1848828535 tempest-ServerRescueTestJSONUnderV235-1848828535-project-member] [instance: e65824b0-5754-4a40-bee8-cd91c606c230] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
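Editor's note: from here on the records largely repeat the same NoValidHost cycle for different tempest requests (Failed to schedule instances, Failed to compute_task_build_instances, Setting instance to ERROR state). When triaging a run like this it can help to collapse the log into one line per failed request id. The helper below is a hypothetical, self-contained sketch; the regular expression is an assumption about this log's layout and the file name is a placeholder, neither is part of Nova.

# Hypothetical triage helper: count "Failed to schedule instances" per request id.
import re
from collections import Counter

FAIL = re.compile(r"\[None (req-[0-9a-f-]+) [^\]]*\] Failed to schedule instances")

def failed_requests(log_text):
    return Counter(FAIL.findall(log_text))

if __name__ == "__main__":
    with open("nova-conductor.log") as f:        # assumed file name
        for req, count in failed_requests(f.read()).most_common():
            print(f"{count:3d}  {req}")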
[ 641.518142] nova-conductor[52020]: ERROR nova.scheduler.utils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 624ec0e2-c230-4469-8ffe-047f914793b1 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 641.519025] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Rescheduling: True {{(pid=52020) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 641.519152] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 624ec0e2-c230-4469-8ffe-047f914793b1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 624ec0e2-c230-4469-8ffe-047f914793b1. [ 641.523450] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 624ec0e2-c230-4469-8ffe-047f914793b1. [ 641.582521] nova-conductor[52020]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] deallocate_for_instance() {{(pid=52020) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 641.712578] nova-conductor[52020]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Instance cache missing network info. {{(pid=52020) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 641.715700] nova-conductor[52020]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Updating instance_info_cache with network_info: [] {{(pid=52020) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 642.281348] nova-conductor[52020]: Traceback (most recent call last): [ 642.281348] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.281348] nova-conductor[52020]: return func(*args, **kwargs) [ 642.281348] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.281348] nova-conductor[52020]: selections = self._select_destinations( [ 642.281348] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.281348] nova-conductor[52020]: selections = self._schedule( [ 642.281348] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.281348] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 642.281348] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.281348] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 642.281348] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
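Editor's note: the event for instance 624ec0e2-c230-4469-8ffe-047f914793b1 is the one failure in this stretch that is not a plain scheduling miss: the VMware driver failed the image copy ("A specified parameter was not correct: fileType"), the compute manager turned that into RescheduledException, and the conductor, with no alternate hosts left to retry, gave up with MaxRetriesExceeded and set the instance to ERROR before deallocating its network. The loop below is an illustrative sketch of that retry-then-give-up shape, with assumed names; it is not the conductor's actual code.

# Illustrative sketch of the reschedule-until-exhausted pattern
# (assumed names; not the actual nova.conductor code).
class RescheduledException(Exception):
    pass

class MaxRetriesExceeded(Exception):
    pass

def build_with_retries(hosts, build, max_attempts=3):
    """Try each candidate host in turn; give up once retries are exhausted."""
    attempts = 0
    for host in hosts:
        if attempts >= max_attempts:
            break
        attempts += 1
        try:
            return build(host)
        except RescheduledException:
            continue            # pick the next candidate host and try again
    raise MaxRetriesExceeded(
        f"Exceeded maximum number of retries after {attempts} attempt(s).")

# Example: the only candidate host fails, so the request ends in MaxRetriesExceeded.
def always_fails(host):
    raise RescheduledException(f"build failed on {host}")

try:
    build_with_retries(["cpu-1"], always_fails)
except MaxRetriesExceeded as exc:
    print(exc)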
[ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
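The conductor-side frames above (scheduler/client/query.py, scheduler/rpcapi.py, oslo_messaging/rpc/client.py) show that select_destinations is a blocking RPC call() to nova-scheduler, and amqpdriver._send re-raises whatever error reply the scheduler returned. The sketch below shows the generic oslo.messaging client pattern behind that call chain; the topic, the context argument and the constructor style are illustrative rather than Nova's exact code, and a configured transport_url is required before any of it is usable.

```python
import oslo_messaging as messaging
from oslo_config import cfg

# Generic oslo.messaging RPC-client pattern matching the call chain in the
# traceback above. Topic and context handling are illustrative only, and
# cfg.CONF must carry a valid transport_url before this is actually usable.

def build_scheduler_client(conf=cfg.CONF):
    transport = messaging.get_rpc_transport(conf)
    target = messaging.Target(topic='scheduler')
    # Older-style constructor; newer oslo.messaging also offers get_rpc_client().
    return messaging.RPCClient(transport, target)


def select_destinations(client, ctxt, spec_obj):
    cctxt = client.prepare()
    # call() blocks for the reply; an exception raised on the scheduler side
    # comes back and is re-raised here (as the *_Remote variant shown above).
    return cctxt.call(ctxt, 'select_destinations', spec_obj=spec_obj)
```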
[ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.281348] nova-conductor[52020]: ERROR nova.conductor.manager [ 642.291816] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.292073] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.292382] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.342228] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] [instance: f48a7d41-da2f-455f-a69d-6a0fe01afbf1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 642.343030] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.343264] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.343470] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.347403] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 642.347403] nova-conductor[52020]: Traceback (most recent call last): [ 642.347403] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.347403] nova-conductor[52020]: return func(*args, **kwargs) [ 642.347403] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.347403] nova-conductor[52020]: selections = self._select_destinations( [ 642.347403] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.347403] nova-conductor[52020]: selections = self._schedule( [ 642.347403] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.347403] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 642.347403] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.347403] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 642.347403] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 642.347403] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 642.348153] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-1546547c-32db-4c36-af81-049a458fa4b8 tempest-ServerAddressesTestJSON-29294824 tempest-ServerAddressesTestJSON-29294824-project-member] [instance: f48a7d41-da2f-455f-a69d-6a0fe01afbf1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 644.751379] nova-conductor[52019]: Traceback (most recent call last): [ 644.751379] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.751379] nova-conductor[52019]: return func(*args, **kwargs) [ 644.751379] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.751379] nova-conductor[52019]: selections = self._select_destinations( [ 644.751379] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.751379] nova-conductor[52019]: selections = self._schedule( [ 644.751379] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.751379] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 644.751379] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.751379] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 644.751379] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
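The Acquiring/acquired/released DEBUG triplets around each failure come from nova.context targeting cell0, where instances that never got scheduled are recorded: per-cell connection setup is serialised on the cell UUID with oslo.concurrency locks and cached after the first use. A rough sketch of that pattern follows; the cache dict and builder callback are hypothetical stand-ins, and only the lockutils usage is the real API.

```python
from oslo_concurrency import lockutils

# Rough sketch of the per-cell locking pattern behind the DEBUG triplets in
# this log. CELL_CACHE and build_connections are hypothetical stand-ins for
# Nova's cached cell state; only the lockutils decorator is the real API.

CELL_CACHE = {}


def get_or_set_cached_cell(cell_uuid, build_connections):
    @lockutils.synchronized(cell_uuid)
    def _get_or_set():
        if cell_uuid not in CELL_CACHE:
            # First caller for this cell builds the DB/MQ connections;
            # concurrent callers wait on the lock and then reuse the cache.
            CELL_CACHE[cell_uuid] = build_connections(cell_uuid)
        return CELL_CACHE[cell_uuid]

    return _get_or_set()


# Example usage with a trivial builder:
# get_or_set_cached_cell("00000000-0000-0000-0000-000000000000", lambda u: {"uuid": u})
```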
[ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.751379] nova-conductor[52019]: ERROR nova.conductor.manager [ 644.763603] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.767122] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.767122] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.827474] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] [instance: bb971e86-ccde-4b55-b293-9b29678fa4ba] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 644.827474] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.827474] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.827474] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 
tempest-AttachInterfacesUnderV243Test-898182292-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.833014] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 644.833014] nova-conductor[52019]: Traceback (most recent call last): [ 644.833014] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.833014] nova-conductor[52019]: return func(*args, **kwargs) [ 644.833014] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.833014] nova-conductor[52019]: selections = self._select_destinations( [ 644.833014] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.833014] nova-conductor[52019]: selections = self._schedule( [ 644.833014] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.833014] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 644.833014] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.833014] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 644.833014] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 644.833014] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 644.833014] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-1d838124-5d21-4d49-88f0-f20c2841b536 tempest-AttachInterfacesUnderV243Test-898182292 tempest-AttachInterfacesUnderV243Test-898182292-project-member] [instance: bb971e86-ccde-4b55-b293-9b29678fa4ba] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
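Each failed build also logs the BlockDeviceMapping the conductor would have created: an image-backed local root disk (source_type='image', destination_type='local', boot_index=0, delete_on_termination=True) for image a816e082-61f0-4ffa-a214-1bf6bd197f53. In the compute API that corresponds to a block_device_mapping_v2 entry in the server-create request; the fragment below is illustrative, with only the block-device fields mirroring the log and the remaining values being placeholders.

```python
# Illustrative compute-API request fragment matching the BlockDeviceMapping
# logged above. Only the block_device_mapping_v2 fields mirror the log; the
# name/flavor/network values are hypothetical placeholders.

server_create_body = {
    "server": {
        "name": "example-server",
        "flavorRef": "1",
        "networks": "auto",
        "block_device_mapping_v2": [
            {
                "boot_index": 0,
                "uuid": "a816e082-61f0-4ffa-a214-1bf6bd197f53",  # Glance image
                "source_type": "image",
                "destination_type": "local",
                "delete_on_termination": True,
            }
        ],
    }
}
```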
[ 645.926679] nova-conductor[52020]: Traceback (most recent call last): [ 645.926679] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 645.926679] nova-conductor[52020]: return func(*args, **kwargs) [ 645.926679] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 645.926679] nova-conductor[52020]: selections = self._select_destinations( [ 645.926679] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 645.926679] nova-conductor[52020]: selections = self._schedule( [ 645.926679] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 645.926679] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 645.926679] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 645.926679] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 645.926679] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.926679] nova-conductor[52020]: ERROR nova.conductor.manager [ 645.935303] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 645.935762] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 645.935987] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 646.000869] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 62340310-446b-4356-9509-d4b5824ff0e8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 646.001800] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.002082] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.002316] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 646.008845] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 646.008845] nova-conductor[52020]: Traceback (most recent call last): [ 646.008845] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 646.008845] nova-conductor[52020]: return func(*args, **kwargs) [ 646.008845] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 646.008845] nova-conductor[52020]: selections = self._select_destinations( [ 646.008845] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 646.008845] nova-conductor[52020]: selections = self._schedule( [ 646.008845] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 646.008845] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 646.008845] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 646.008845] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 646.008845] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 646.008845] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 646.009702] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-3e8d2f54-c465-4cf3-9852-b5a8664a8840 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 62340310-446b-4356-9509-d4b5824ff0e8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 648.309245] nova-conductor[52019]: Traceback (most recent call last): [ 648.309245] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 648.309245] nova-conductor[52019]: return func(*args, **kwargs) [ 648.309245] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 648.309245] nova-conductor[52019]: selections = self._select_destinations( [ 648.309245] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 648.309245] nova-conductor[52019]: selections = self._schedule( [ 648.309245] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 648.309245] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 648.309245] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 648.309245] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 648.309245] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.309245] nova-conductor[52019]: ERROR nova.conductor.manager [ 648.319887] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.320171] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.320348] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.402395] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 4a646d2b-6443-42b4-93f6-cfe4d94b554d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 648.404650] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.404650] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.404650] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.411260] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 648.411260] nova-conductor[52019]: Traceback (most recent call last): [ 648.411260] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 648.411260] nova-conductor[52019]: return func(*args, **kwargs) [ 648.411260] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 648.411260] nova-conductor[52019]: selections = self._select_destinations( [ 648.411260] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 648.411260] nova-conductor[52019]: selections = self._schedule( [ 648.411260] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 648.411260] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 648.411260] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 648.411260] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 648.411260] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 648.411260] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 648.411798] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-e44a1828-442d-4c83-aebf-9793ab6b956b tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 4a646d2b-6443-42b4-93f6-cfe4d94b554d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.514787] nova-conductor[52020]: Traceback (most recent call last): [ 650.514787] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.514787] nova-conductor[52020]: return func(*args, **kwargs) [ 650.514787] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.514787] nova-conductor[52020]: selections = self._select_destinations( [ 650.514787] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.514787] nova-conductor[52020]: selections = self._schedule( [ 650.514787] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.514787] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 650.514787] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.514787] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 650.514787] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.514787] nova-conductor[52020]: ERROR nova.conductor.manager [ 650.526073] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.526073] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.526073] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.592217] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] [instance: e98d86a5-a014-4209-a820-1e32aff1e81f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 650.594797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.594797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.594797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.606595] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 650.606595] nova-conductor[52020]: Traceback (most recent call last): [ 650.606595] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.606595] nova-conductor[52020]: return func(*args, **kwargs) [ 650.606595] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.606595] nova-conductor[52020]: selections = self._select_destinations( [ 650.606595] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.606595] nova-conductor[52020]: selections = self._schedule( [ 650.606595] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.606595] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 650.606595] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.606595] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 650.606595] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.606595] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.606595] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-be2726bf-b255-408c-8a2b-c8a57c457253 tempest-ServersAaction247Test-2141225313 tempest-ServersAaction247Test-2141225313-project-member] [instance: e98d86a5-a014-4209-a820-1e32aff1e81f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 651.769859] nova-conductor[52019]: Traceback (most recent call last): [ 651.769859] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 651.769859] nova-conductor[52019]: return func(*args, **kwargs) [ 651.769859] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 651.769859] nova-conductor[52019]: selections = self._select_destinations( [ 651.769859] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 651.769859] nova-conductor[52019]: selections = self._schedule( [ 651.769859] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 651.769859] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 651.769859] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 651.769859] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 651.769859] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.769859] nova-conductor[52019]: ERROR nova.conductor.manager [ 651.776520] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 651.776665] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 651.776961] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 651.853877] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] [instance: 13b7ceaa-dfa5-4a82-85ba-a8c0b112d3ab] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 651.857536] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 651.857811] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 651.857945] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 651.876259] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 651.876259] nova-conductor[52019]: Traceback (most recent call last): [ 651.876259] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 651.876259] nova-conductor[52019]: return func(*args, **kwargs) [ 651.876259] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 651.876259] nova-conductor[52019]: selections = self._select_destinations( [ 651.876259] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 651.876259] nova-conductor[52019]: selections = self._schedule( [ 651.876259] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 651.876259] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 651.876259] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 651.876259] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 651.876259] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 651.876259] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 651.876830] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-a9fc1222-d9a5-4897-8d3d-a647a1eed6e7 tempest-AttachInterfacesTestJSON-1140162678 tempest-AttachInterfacesTestJSON-1140162678-project-member] [instance: 13b7ceaa-dfa5-4a82-85ba-a8c0b112d3ab] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 656.390459] nova-conductor[52020]: Traceback (most recent call last): [ 656.390459] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.390459] nova-conductor[52020]: return func(*args, **kwargs) [ 656.390459] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.390459] nova-conductor[52020]: selections = self._select_destinations( [ 656.390459] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.390459] nova-conductor[52020]: selections = self._schedule( [ 656.390459] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.390459] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 656.390459] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.390459] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 656.390459] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.390459] nova-conductor[52020]: ERROR nova.conductor.manager [ 656.403370] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.403732] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.003s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.403870] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.451507] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 86c4ce9e-9408-4c8c-8169-6997ae7d6fed] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 656.452254] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.452330] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.452493] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.463055] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 656.463055] nova-conductor[52020]: Traceback (most recent call last): [ 656.463055] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.463055] nova-conductor[52020]: return func(*args, **kwargs) [ 656.463055] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.463055] nova-conductor[52020]: selections = self._select_destinations( [ 656.463055] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.463055] nova-conductor[52020]: selections = self._schedule( [ 656.463055] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.463055] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 656.463055] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.463055] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 656.463055] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 656.463055] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.463877] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-367fe66d-373a-48e4-b444-5bb525ca396f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 86c4ce9e-9408-4c8c-8169-6997ae7d6fed] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 658.878096] nova-conductor[52019]: Traceback (most recent call last): [ 658.878096] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 658.878096] nova-conductor[52019]: return func(*args, **kwargs) [ 658.878096] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 658.878096] nova-conductor[52019]: selections = self._select_destinations( [ 658.878096] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 658.878096] nova-conductor[52019]: selections = self._schedule( [ 658.878096] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 658.878096] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 658.878096] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 658.878096] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 658.878096] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.878096] nova-conductor[52019]: ERROR nova.conductor.manager [ 658.891092] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 658.891092] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.891092] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.009020] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] [instance: 12935365-8280-4742-a407-906c86a31c36] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 659.009819] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.010053] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.010226] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.014033] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 659.014033] nova-conductor[52019]: Traceback (most recent call last): [ 659.014033] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.014033] nova-conductor[52019]: return func(*args, **kwargs) [ 659.014033] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.014033] nova-conductor[52019]: selections = self._select_destinations( [ 659.014033] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.014033] nova-conductor[52019]: selections = self._schedule( [ 659.014033] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.014033] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 659.014033] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.014033] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 659.014033] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.014033] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.015763] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-55cbad79-7796-4ccb-a030-24f23ef4fea6 tempest-AttachInterfacesV270Test-1647516756 tempest-AttachInterfacesV270Test-1647516756-project-member] [instance: 12935365-8280-4742-a407-906c86a31c36] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 660.659480] nova-conductor[52020]: Traceback (most recent call last): [ 660.659480] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.659480] nova-conductor[52020]: return func(*args, **kwargs) [ 660.659480] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.659480] nova-conductor[52020]: selections = self._select_destinations( [ 660.659480] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.659480] nova-conductor[52020]: selections = self._schedule( [ 660.659480] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.659480] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 660.659480] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.659480] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 660.659480] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.659480] nova-conductor[52020]: ERROR nova.conductor.manager [ 660.672475] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.672610] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.672782] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.735884] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 1eb64caf-bb75-4c74-86da-3bb2189c29af] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 660.736913] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.737162] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.737340] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.741479] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 660.741479] nova-conductor[52020]: Traceback (most recent call last): [ 660.741479] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.741479] nova-conductor[52020]: return func(*args, **kwargs) [ 660.741479] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.741479] nova-conductor[52020]: selections = self._select_destinations( [ 660.741479] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.741479] nova-conductor[52020]: selections = self._schedule( [ 660.741479] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.741479] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 660.741479] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.741479] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 660.741479] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.741479] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.742773] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-8270dfcb-fe15-4309-a1bf-593722500bc3 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 1eb64caf-bb75-4c74-86da-3bb2189c29af] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 664.807415] nova-conductor[52019]: Traceback (most recent call last): [ 664.807415] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 664.807415] nova-conductor[52019]: return func(*args, **kwargs) [ 664.807415] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 664.807415] nova-conductor[52019]: selections = self._select_destinations( [ 664.807415] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 664.807415] nova-conductor[52019]: selections = self._schedule( [ 664.807415] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 664.807415] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 664.807415] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 664.807415] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 664.807415] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.807415] nova-conductor[52019]: ERROR nova.conductor.manager [ 664.815441] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.815904] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.816162] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.867088] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 52b4364d-69ad-47ca-92f9-86300e2431b3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 664.867836] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.868071] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.868241] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 
tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.875188] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 664.875188] nova-conductor[52019]: Traceback (most recent call last): [ 664.875188] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 664.875188] nova-conductor[52019]: return func(*args, **kwargs) [ 664.875188] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 664.875188] nova-conductor[52019]: selections = self._select_destinations( [ 664.875188] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 664.875188] nova-conductor[52019]: selections = self._schedule( [ 664.875188] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 664.875188] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 664.875188] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 664.875188] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 664.875188] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 664.875188] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 664.876396] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-e259c3f3-4bc1-4d93-a0f2-8458944572bb tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 52b4364d-69ad-47ca-92f9-86300e2431b3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 665.022398] nova-conductor[52020]: Traceback (most recent call last): [ 665.022398] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 665.022398] nova-conductor[52020]: return func(*args, **kwargs) [ 665.022398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 665.022398] nova-conductor[52020]: selections = self._select_destinations( [ 665.022398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 665.022398] nova-conductor[52020]: selections = self._schedule( [ 665.022398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 665.022398] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 665.022398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 665.022398] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 665.022398] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.022398] nova-conductor[52020]: ERROR nova.conductor.manager [ 665.029172] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.029382] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.029579] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.073211] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 6b4f2e71-86e4-4076-b7ef-b7ec4793bfbf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 665.073900] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.074127] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.074296] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.077196] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 665.077196] nova-conductor[52020]: Traceback (most recent call last): [ 665.077196] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 665.077196] nova-conductor[52020]: return func(*args, **kwargs) [ 665.077196] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 665.077196] nova-conductor[52020]: selections = self._select_destinations( [ 665.077196] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 665.077196] nova-conductor[52020]: selections = self._schedule( [ 665.077196] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 665.077196] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 665.077196] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 665.077196] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 665.077196] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 665.077196] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 665.077886] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-ac80d378-68b3-49dc-8d56-9b48820a546a tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 6b4f2e71-86e4-4076-b7ef-b7ec4793bfbf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 665.919685] nova-conductor[52019]: Traceback (most recent call last): [ 665.919685] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 665.919685] nova-conductor[52019]: return func(*args, **kwargs) [ 665.919685] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 665.919685] nova-conductor[52019]: selections = self._select_destinations( [ 665.919685] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 665.919685] nova-conductor[52019]: selections = self._schedule( [ 665.919685] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 665.919685] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 665.919685] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 665.919685] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 665.919685] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.919685] nova-conductor[52019]: ERROR nova.conductor.manager [ 665.927628] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.927958] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.928216] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.986594] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] [instance: 12d8f38e-2bb8-4f33-834a-7ac7370699b1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 665.987328] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.988680] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.988680] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 
tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.992732] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 665.992732] nova-conductor[52019]: Traceback (most recent call last): [ 665.992732] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 665.992732] nova-conductor[52019]: return func(*args, **kwargs) [ 665.992732] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 665.992732] nova-conductor[52019]: selections = self._select_destinations( [ 665.992732] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 665.992732] nova-conductor[52019]: selections = self._schedule( [ 665.992732] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 665.992732] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 665.992732] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 665.992732] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 665.992732] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 665.992732] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 665.993364] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-aa1a9109-4671-4f79-868e-3334c2e54dc4 tempest-ServerAddressesNegativeTestJSON-1475618111 tempest-ServerAddressesNegativeTestJSON-1475618111-project-member] [instance: 12d8f38e-2bb8-4f33-834a-7ac7370699b1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 667.408910] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 667.424567] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.424852] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.425097] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.462019] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.462019] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.462019] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.462019] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.462019] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.462019] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.469709] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 667.471230] nova-conductor[52019]: DEBUG nova.quota [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Getting quotas for project 171afaa5f3e84fce99d714d965673aab. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 667.473481] nova-conductor[52019]: DEBUG nova.quota [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Getting quotas for user e326832f5f0244e28c495002df50b11d and project 171afaa5f3e84fce99d714d965673aab. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 667.479268] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 667.480597] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.481306] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.481306] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.482665] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.482899] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.483159] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.486707] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 667.487400] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.487605] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.487779] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.501875] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.502430] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.502902] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.512833] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.513073] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.513244] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.514925] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.515206] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 
tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.515385] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.527065] nova-conductor[52020]: DEBUG nova.quota [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Getting quotas for project b2f7e4ed23c24eaf9e9b0300e9b8b2bf. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 667.529131] nova-conductor[52020]: DEBUG nova.quota [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Getting quotas for user c75afbe4a97240c49aaed07125b26a5e and project b2f7e4ed23c24eaf9e9b0300e9b8b2bf. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 667.536232] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 667.536748] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.536954] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.537169] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.542022] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 
tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 667.542214] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.542381] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.542552] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.556540] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.556753] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.556924] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.674020] nova-conductor[52019]: DEBUG nova.conductor.manager [None 
req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 668.690929] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.691208] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.692120] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.765570] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.765788] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.765951] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.766317] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 
668.766496] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.766647] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.777293] nova-conductor[52019]: DEBUG nova.quota [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Getting quotas for project 188cc585bbcd41899980076b5c302bd1. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 668.783328] nova-conductor[52019]: DEBUG nova.quota [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Getting quotas for user 6ad48be56706489fb716357da3ba96b1 and project 188cc585bbcd41899980076b5c302bd1. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 668.796416] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 668.796944] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.797173] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.797374] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.805471] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 668.806204] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.806457] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.806547] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.820511] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.820732] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.820906] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock 
"11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.650652] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 670.669318] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.669382] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.670302] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.725022] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.725193] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.725292] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.725648] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 
tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.725827] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.725980] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.737718] nova-conductor[52020]: DEBUG nova.quota [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Getting quotas for project 6d2413e255144034ba23edb5eac6962a. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 670.743615] nova-conductor[52020]: DEBUG nova.quota [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Getting quotas for user 5df5c462491a49c3a5aedc630cd0bfac and project 6d2413e255144034ba23edb5eac6962a. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 670.746551] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 670.746917] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.747114] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.747281] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.750303] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] block_device_mapping [BlockDeviceMapping(attachment_id=16819a76-470e-4eaa-9cb4-5f5595eb70ed,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='26f0bfa0-6fa3-442a-8115-af5c9e8839d2',volume_size=1,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 670.750966] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.751182] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 
tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.751344] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.765084] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.765292] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.765465] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.477602] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 671.490804] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.491049] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.491330] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.523975] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.524316] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.524430] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.524759] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.525362] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.525362] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.534155] nova-conductor[52019]: DEBUG nova.quota [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Getting quotas for project cba7f3dcabc846a1b0b233e2a84f1a9a. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 671.536534] nova-conductor[52019]: DEBUG nova.quota [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Getting quotas for user ab5cca9aff134861993fc7050f446c23 and project cba7f3dcabc846a1b0b233e2a84f1a9a. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 671.542449] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 671.542939] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.543163] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.543331] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.551020] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 671.551020] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.551020] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.551020] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.566324] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.566428] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.566551] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.114353] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 672.129031] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.129253] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.129648] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.159880] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.160143] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.160318] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.160671] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.160880] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.161062] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.171327] nova-conductor[52020]: DEBUG nova.quota [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Getting quotas for project e3b401b63a9c430c97a0d087a98fe664. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 672.173608] nova-conductor[52020]: DEBUG nova.quota [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Getting quotas for user 9b124d362ff241668bb97f2dbfec39c7 and project e3b401b63a9c430c97a0d087a98fe664. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 672.179604] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 672.180201] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.180404] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.180732] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.184383] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 672.185063] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.185272] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.185440] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.199113] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.199343] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.199513] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.072427] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Took 0.16 seconds to 
select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 673.086146] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.086568] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.086893] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.121604] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.121604] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.121604] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.121604] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.121604] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b 
tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.121604] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.131924] nova-conductor[52019]: DEBUG nova.quota [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Getting quotas for project 0d64b8a4deb3494ab94a07334278cf23. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 673.134483] nova-conductor[52019]: DEBUG nova.quota [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Getting quotas for user 6e11af25776d40a59c3119ba4d8512fa and project 0d64b8a4deb3494ab94a07334278cf23. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 673.141138] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 673.141138] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.141897] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.142143] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
673.145124] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 673.145837] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.146095] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.146305] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.160967] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.162115] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.162115] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "11743078-64bf-4468-a786-557c60808969" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.199277] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 674.213557] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.213797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.213979] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.243278] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.243495] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.243663] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.244016] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.244202] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.244365] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.252896] nova-conductor[52020]: DEBUG nova.quota [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Getting quotas for project 571ed5374d8840139a771d05ed89bd62. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 674.255333] nova-conductor[52020]: DEBUG nova.quota [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Getting quotas for user e4974158415f452aa156e7cf2682d2ca and project 571ed5374d8840139a771d05ed89bd62. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 674.261105] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 5df12084-5dd6-41d1-9743-747f17ce3323] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 674.261576] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.261784] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.261952] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.264751] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 5df12084-5dd6-41d1-9743-747f17ce3323] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 674.265440] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.265639] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.265805] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.277930] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.278152] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.278314] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 
tempest-DeleteServersTestJSON-1450295516-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.660151] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.660401] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.660575] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.796516] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 675.808088] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.808317] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.808484] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.836579] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.836795] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.836965] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.837378] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.837566] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.837722] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.845658] nova-conductor[52019]: DEBUG nova.quota [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Getting quotas for project 5e2f3f3508fd4d48820179768949f308. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 675.848085] nova-conductor[52019]: DEBUG nova.quota [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Getting quotas for user 171203040b3145889334abf806833bb7 and project 5e2f3f3508fd4d48820179768949f308. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 675.853706] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] [instance: 2ed6496a-3e75-4cfd-88da-9e0b731f738a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 675.854158] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.854333] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.854498] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.857616] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] [instance: 2ed6496a-3e75-4cfd-88da-9e0b731f738a] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 675.858258] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.858455] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.858619] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.870343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.870549] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.870713] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.248992] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Took 0.12 seconds to select destinations for 
1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 676.260936] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.261199] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.261372] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.287840] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.288193] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.288493] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.288920] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.289210] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.289382] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.298360] nova-conductor[52020]: DEBUG nova.quota [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Getting quotas for project 6ad68b5676f0436399551ff0578f0b0c. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 676.300738] nova-conductor[52020]: DEBUG nova.quota [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Getting quotas for user 573dd1f6e0f24f409336e7da33929f2e and project 6ad68b5676f0436399551ff0578f0b0c. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 676.306550] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 49d76773-e163-440b-aa99-08c379155149] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 676.307024] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.307228] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.307532] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.310900] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 49d76773-e163-440b-aa99-08c379155149] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 676.310900] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.311091] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.311237] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.327988] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.328122] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.328285] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.361564] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 681.385798] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.386046] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.386230] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.419858] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.420116] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.420290] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.420637] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.420819] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.420981] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.429019] nova-conductor[52019]: DEBUG nova.quota [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Getting quotas for project cfe5778e2a4847eea40747480dff93d0. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 681.431419] nova-conductor[52019]: DEBUG nova.quota [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Getting quotas for user b9900836f93a4994b94c8544be6a16d5 and project cfe5778e2a4847eea40747480dff93d0. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 681.439150] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: dac8465a-592f-461c-af5b-49369eed5e70] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 681.439785] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.440049] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.440218] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.444641] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: dac8465a-592f-461c-af5b-49369eed5e70] 
block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 681.445305] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.445498] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.445657] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.460591] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.460823] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.461064] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.788542] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 
tempest-ServerShowV247Test-1223997438-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 681.802734] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.802734] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.802734] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.839250] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.839478] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.839647] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.840116] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.840309] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock 
"11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.840469] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.850486] nova-conductor[52020]: DEBUG nova.quota [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Getting quotas for project 96ade1b38d154e2ba8f04520e1476c12. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 681.852921] nova-conductor[52020]: DEBUG nova.quota [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Getting quotas for user d600e9a87a0844f991b96545e6d08b06 and project 96ade1b38d154e2ba8f04520e1476c12. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 681.859382] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: 54605814-fdf4-43c7-9316-0d2594cdb5fa] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 681.859969] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.860173] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.860340] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.865854] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: 54605814-fdf4-43c7-9316-0d2594cdb5fa] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 681.866556] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.866766] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.867112] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.880285] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.880500] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.880672] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.482043] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Took 0.12 seconds to 
select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 682.493670] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.493670] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.493670] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.543985] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.544238] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.544412] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.544768] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.544949] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 
tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.545128] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.555749] nova-conductor[52019]: DEBUG nova.quota [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Getting quotas for project de8d0a8b1b0e47a1976f8eb2a7f448de. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 682.558414] nova-conductor[52019]: DEBUG nova.quota [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Getting quotas for user 7399c9d4d40b4f33bd195739410db787 and project de8d0a8b1b0e47a1976f8eb2a7f448de. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 682.565080] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] [instance: f196648e-0e82-4a01-91fc-af1ba61f0490] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 682.565563] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.565768] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.565932] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.572244] 
nova-conductor[52019]: DEBUG nova.conductor.manager [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] [instance: f196648e-0e82-4a01-91fc-af1ba61f0490] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 682.573174] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.573174] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.573348] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.587563] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.587788] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.588035] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.650544] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 682.666587] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.666587] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.666812] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.696058] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.696410] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.696780] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.697238] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
{{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.697535] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.697823] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.706034] nova-conductor[52020]: DEBUG nova.quota [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Getting quotas for project 39d35f1952514f8faa2f19c49906a008. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 682.708515] nova-conductor[52020]: DEBUG nova.quota [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Getting quotas for user 4ecaad0c441d48edafda5162412539f9 and project 39d35f1952514f8faa2f19c49906a008. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 682.715254] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] [instance: 66420486-d25e-457d-94cd-6f96fca2df7d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 682.715877] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.716227] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.716496] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.721681] 
nova-conductor[52020]: DEBUG nova.conductor.manager [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] [instance: 66420486-d25e-457d-94cd-6f96fca2df7d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 682.722764] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.725017] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.725017] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.736448] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.736877] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.737235] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
683.056035] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 683.068689] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.068870] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.069055] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.095465] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.095736] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.095860] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.096221] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.096408] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None 
req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.096567] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.105648] nova-conductor[52019]: DEBUG nova.quota [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Getting quotas for project 96ade1b38d154e2ba8f04520e1476c12. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 683.110082] nova-conductor[52019]: DEBUG nova.quota [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Getting quotas for user d600e9a87a0844f991b96545e6d08b06 and project 96ade1b38d154e2ba8f04520e1476c12. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 683.117461] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 683.117461] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.117461] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.117461] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.121355] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 
tempest-ServerShowV247Test-1223997438-project-member] [instance: a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 683.122040] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.122176] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.122335] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.134889] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.135101] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.135267] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.830277] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 
tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance ccaea0a9-59d6-456a-9885-2b90abf30abb was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 690.832031] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 690.832633] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ccaea0a9-59d6-456a-9885-2b90abf30abb.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ccaea0a9-59d6-456a-9885-2b90abf30abb. 
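The traceback above captures the failure mode that repeats throughout this run: vCenter rejects the CopyVirtualDisk task with InvalidArgument on fileType, the compute manager wraps that in a RescheduledException, and the conductor then has nowhere left to retry (the scheduling records above select cpu-1 with "Alternates: []"), so it raises MaxRetriesExceeded and the instance goes to ERROR. A minimal, self-contained sketch of that control flow follows; the class and function names are stand-ins for illustration, not Nova's actual code.

# Sketch only: stand-in exceptions and functions mirroring the chain in the traceback
# above (VimFaultException -> RescheduledException -> MaxRetriesExceeded). Not Nova code.

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

class RescheduledException(Exception):
    """Stand-in for nova.exception.RescheduledException."""

class MaxRetriesExceeded(Exception):
    """Stand-in for nova.exception.MaxRetriesExceeded."""

def spawn_on_host(host: str) -> None:
    # The driver-level failure: the disk copy task is rejected by vCenter.
    raise VimFaultException("A specified parameter was not correct: fileType")

def build_and_run_instance(host: str) -> None:
    try:
        spawn_on_host(host)
    except VimFaultException as exc:
        # Chained re-raise: this is the "During handling of the above exception,
        # another exception occurred" section of the logged traceback.
        raise RescheduledException(f"Build was re-scheduled: {exc}") from exc

def build_with_retries(hosts: list[str]) -> None:
    for host in hosts:
        try:
            build_and_run_instance(host)
            return
        except RescheduledException:
            continue  # try the next alternate host, if any
    # With "Alternates: []" there is nothing left to try.
    raise MaxRetriesExceeded(
        "Exceeded maximum number of retries. Exhausted all hosts available "
        "for retrying build failures")

if __name__ == "__main__":
    try:
        build_with_retries(["cpu-1"])  # single host, no alternates, as in this log
    except MaxRetriesExceeded as exc:
        print(f"Setting instance to ERROR state: {exc}")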
[ 690.833289] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ccaea0a9-59d6-456a-9885-2b90abf30abb. [ 690.882194] nova-conductor[52019]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 691.002440] nova-conductor[52019]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 691.007355] nova-conductor[52019]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.550820] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Took 0.11 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 716.562182] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.562417] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.562591] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.591323] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.591554] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.591726] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.592095] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.592285] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock 
"11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.592441] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.600916] nova-conductor[52020]: DEBUG nova.quota [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Getting quotas for project 396b40d8809545dc8eeb0fc355cfcc58. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 716.604051] nova-conductor[52020]: DEBUG nova.quota [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Getting quotas for user 046a5a655c774306b2680b89927ba285 and project 396b40d8809545dc8eeb0fc355cfcc58. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 716.608848] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 716.609327] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.609544] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.609706] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.612533] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] 
[instance: 13aee471-4813-4376-a7bf-70f266d9a399] block_device_mapping [BlockDeviceMapping(attachment_id=02d65261-f106-4b85-8a32-25697f0b2252,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='9c19a3a5-ade9-4b1f-afd2-6e797690b152',volume_size=1,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 716.613917] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.613917] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.613917] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.633861] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.633861] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.633861] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.759683] nova-conductor[52019]: ERROR nova.scheduler.utils 
[None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance d74c7de4-5126-483f-9576-89e0007310b8 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 739.759683] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 739.759683] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d74c7de4-5126-483f-9576-89e0007310b8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d74c7de4-5126-483f-9576-89e0007310b8. 
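Aside from the recurring spawn failure, the block_device_mapping records logged earlier in this section show the two boot styles these tempest tests exercise: an image-backed local disk for instance a1dd5ca8 (source_type='image', destination_type='local', image a816e082-61f0-4ffa-a214-1bf6bd197f53) and boot-from-volume for instance 13aee471 (source_type='volume', destination_type='volume', volume 9c19a3a5-ade9-4b1f-afd2-6e797690b152 with a pre-created attachment). Roughly, the API-side block_device_mapping_v2 requests behind them would look like the sketch below; the dict layout is the standard v2 request format and is an assumption here, since the log only shows the resulting BlockDeviceMapping objects.

# Rough illustration only (assumed block_device_mapping_v2 request shapes); the UUIDs
# are the image_id and volume_id taken from the two log records referenced above.

image_backed_bdm = {            # instance a1dd5ca8-... (ServerShowV247Test)
    "boot_index": 0,
    "source_type": "image",
    "destination_type": "local",    # ephemeral disk created from the image on the host
    "uuid": "a816e082-61f0-4ffa-a214-1bf6bd197f53",
    "delete_on_termination": True,
}

volume_backed_bdm = {           # instance 13aee471-... (ServersTestBootFromVolume)
    "boot_index": 0,
    "source_type": "volume",
    "destination_type": "volume",   # boot from an existing Cinder volume
    "uuid": "9c19a3a5-ade9-4b1f-afd2-6e797690b152",
    "volume_size": 1,               # volume_size=1 in the logged BlockDeviceMapping
    "delete_on_termination": True,
}

if __name__ == "__main__":
    for name, bdm in (("image-backed", image_backed_bdm),
                      ("volume-backed", volume_backed_bdm)):
        print(f"{name}: {bdm['source_type']} -> {bdm['destination_type']} ({bdm['uuid']})")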
[ 739.759683] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d74c7de4-5126-483f-9576-89e0007310b8. [ 739.782558] nova-conductor[52019]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 739.799265] nova-conductor[52019]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 739.802475] nova-conductor[52019]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 742.857024] nova-conductor[52019]: Traceback (most recent call last): [ 742.857024] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 742.857024] nova-conductor[52019]: return func(*args, **kwargs) [ 742.857024] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 742.857024] nova-conductor[52019]: selections = self._select_destinations( [ 742.857024] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 742.857024] nova-conductor[52019]: selections = self._schedule( [ 742.857024] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 742.857024] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 742.857024] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 742.857024] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 742.857024] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.857024] nova-conductor[52019]: ERROR nova.conductor.manager [ 742.864844] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.865122] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.865347] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.915784] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] [instance: 69c3c034-7b7e-46fc-90f7-1d76d2db5c0d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 742.916640] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.916883] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.918263] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 
tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.920336] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 742.920336] nova-conductor[52019]: Traceback (most recent call last): [ 742.920336] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 742.920336] nova-conductor[52019]: return func(*args, **kwargs) [ 742.920336] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 742.920336] nova-conductor[52019]: selections = self._select_destinations( [ 742.920336] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 742.920336] nova-conductor[52019]: selections = self._schedule( [ 742.920336] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 742.920336] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 742.920336] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 742.920336] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 742.920336] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 742.920336] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 742.921401] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-e930889b-76bf-4294-a8c1-138e3e327fd5 tempest-ServersAdminNegativeTestJSON-1657668231 tempest-ServersAdminNegativeTestJSON-1657668231-project-member] [instance: 69c3c034-7b7e-46fc-90f7-1d76d2db5c0d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
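The NoValidHost failure above is logged twice because it crosses an RPC boundary: the scheduler raises it in _ensure_sufficient_hosts while serving select_destinations, oslo.messaging ships the exception and its traceback back over RPC, and the conductor re-raises a dynamically built NoValidHost_Remote subclass (hence the nova.exception_Remote prefix) and then replays the remote traceback line by line under ERROR nova.conductor.manager. A simplified, self-contained sketch of that round trip, with a toy rpc_call standing in for the real oslo.messaging client/server pair:

# Simplified sketch of the remote-exception pattern visible above; rpc_call is a toy
# stand-in for the oslo.messaging request/response path, not its real API.
import traceback

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""

def ensure_sufficient_hosts(hosts: list) -> None:
    # Scheduler-side check: with no weighed hosts left, scheduling cannot proceed.
    if not hosts:
        raise NoValidHost("No valid host was found. There are not enough hosts available.")

def rpc_call(server_side_fn, *args):
    """Run the 'server side', and replay any failure as a *_Remote copy on the 'client'."""
    try:
        return server_side_fn(*args)
    except Exception as exc:
        # Build a dynamic subclass named <Exception>_Remote, carrying the original
        # message plus the server-side traceback text, and raise it client-side.
        remote_cls = type(f"{type(exc).__name__}_Remote", (type(exc),), {})
        raise remote_cls(f"{exc}\n{traceback.format_exc()}") from None

if __name__ == "__main__":
    try:
        rpc_call(ensure_sufficient_hosts, [])   # empty host list, as in this environment
    except NoValidHost as exc:                  # the _Remote subclass is still a NoValidHost
        print(type(exc).__name__, "->", str(exc).splitlines()[0])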
[ 743.510430] nova-conductor[52020]: ERROR nova.scheduler.utils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 14e395c0-3650-40d6-82f1-1bd8f0b29984 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 743.513036] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Rescheduling: True {{(pid=52020) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 743.513036] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 14e395c0-3650-40d6-82f1-1bd8f0b29984.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 14e395c0-3650-40d6-82f1-1bd8f0b29984. 
[ 743.513036] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 14e395c0-3650-40d6-82f1-1bd8f0b29984. [ 743.536789] nova-conductor[52020]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] deallocate_for_instance() {{(pid=52020) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 743.564179] nova-conductor[52020]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Instance cache missing network info. {{(pid=52020) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 743.567450] nova-conductor[52020]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Updating instance_info_cache with network_info: [] {{(pid=52020) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 746.757391] nova-conductor[52020]: Traceback (most recent call last): [ 746.757391] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 746.757391] nova-conductor[52020]: return func(*args, **kwargs) [ 746.757391] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 746.757391] nova-conductor[52020]: selections = self._select_destinations( [ 746.757391] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 746.757391] nova-conductor[52020]: selections = self._schedule( [ 746.757391] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 746.757391] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 746.757391] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 746.757391] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 746.757391] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.757391] nova-conductor[52020]: ERROR nova.conductor.manager [ 746.766797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.766797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.766797] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.830724] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 96dc2c7c-8455-44c7-8a7d-5d6d7903fdb2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 746.832154] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.832547] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.832784] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.836605] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 746.836605] nova-conductor[52020]: Traceback (most recent call last): [ 746.836605] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 746.836605] nova-conductor[52020]: return func(*args, **kwargs) [ 746.836605] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 746.836605] nova-conductor[52020]: selections = self._select_destinations( [ 746.836605] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 746.836605] nova-conductor[52020]: selections = self._schedule( [ 746.836605] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 746.836605] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 746.836605] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 746.836605] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 746.836605] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 746.836605] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 746.837321] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-e59bbb2e-8f2d-4c3a-b364-c252d07b11fe tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 96dc2c7c-8455-44c7-8a7d-5d6d7903fdb2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 751.581423] nova-conductor[52020]: Traceback (most recent call last): [ 751.581423] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.581423] nova-conductor[52020]: return func(*args, **kwargs) [ 751.581423] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.581423] nova-conductor[52020]: selections = self._select_destinations( [ 751.581423] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.581423] nova-conductor[52020]: selections = self._schedule( [ 751.581423] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.581423] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 751.581423] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.581423] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 751.581423] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.581423] nova-conductor[52020]: ERROR nova.conductor.manager [ 751.594199] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.594199] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.594199] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.688023] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] [instance: 2dd6864b-6980-4e92-9a3c-cfa94185f162] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 751.688023] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.688023] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.688493] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.692303] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 751.692303] nova-conductor[52020]: Traceback (most recent call last): [ 751.692303] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.692303] nova-conductor[52020]: return func(*args, **kwargs) [ 751.692303] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.692303] nova-conductor[52020]: selections = self._select_destinations( [ 751.692303] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.692303] nova-conductor[52020]: selections = self._schedule( [ 751.692303] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.692303] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 751.692303] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.692303] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 751.692303] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 751.692303] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.693825] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-355812aa-810a-4df8-98c5-0d85a0afde9c tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] [instance: 2dd6864b-6980-4e92-9a3c-cfa94185f162] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 755.996190] nova-conductor[52019]: Traceback (most recent call last): [ 755.996190] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 755.996190] nova-conductor[52019]: return func(*args, **kwargs) [ 755.996190] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 755.996190] nova-conductor[52019]: selections = self._select_destinations( [ 755.996190] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 755.996190] nova-conductor[52019]: selections = self._schedule( [ 755.996190] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 755.996190] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 755.996190] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 755.996190] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 755.996190] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 755.996190] nova-conductor[52019]: ERROR nova.conductor.manager [ 756.004308] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.004623] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.004873] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.055584] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] [instance: 5cb86085-958f-428a-95a5-2471b6576043] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 756.055584] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.055584] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.055584] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.062641] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 756.062641] nova-conductor[52019]: Traceback (most recent call last): [ 756.062641] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 756.062641] nova-conductor[52019]: return func(*args, **kwargs) [ 756.062641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 756.062641] nova-conductor[52019]: selections = self._select_destinations( [ 756.062641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 756.062641] nova-conductor[52019]: selections = self._schedule( [ 756.062641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 756.062641] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 756.062641] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 756.062641] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 756.062641] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 756.062641] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 756.062641] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-f4c8ed8b-41d9-4d1c-b4eb-534824188c4d tempest-ServersTestMultiNic-1512351721 tempest-ServersTestMultiNic-1512351721-project-member] [instance: 5cb86085-958f-428a-95a5-2471b6576043] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.966535] nova-conductor[52019]: DEBUG nova.db.main.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Created instance_extra for b9ffb5d9-8d56-4980-9e78-1e003cd56f7e {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 792.428013] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 792.446126] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.446358] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.446527] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.490534] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.490773] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.490942] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.491394] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.491591] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.491750] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.500016] nova-conductor[52020]: DEBUG nova.quota [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Getting quotas for project 60a64c047df340008256e691a618d959. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 792.503139] nova-conductor[52020]: DEBUG nova.quota [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Getting quotas for user f336cb51b0c249b09243817121e20c63 and project 60a64c047df340008256e691a618d959. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 792.508785] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 792.509226] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.509421] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.509584] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.512748] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 792.513374] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.513574] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.513739] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.526604] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.526604] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.526828] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.513511] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 
tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 2342e3da-6d68-466a-9140-ced4eeda73d7 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 798.514057] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 798.514612] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2342e3da-6d68-466a-9140-ced4eeda73d7.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2342e3da-6d68-466a-9140-ced4eeda73d7. 
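Note: the reschedule failures recorded above all follow one pattern. The VMware driver's spawn path fails while caching the image (vmops._cache_sparse_image -> vm_util.copy_virtual_disk) with oslo_vmware.exceptions.VimFaultException "A specified parameter was not correct: fileType", the compute manager wraps that in nova.exception.RescheduledException, and the conductor, having been given no alternate hosts by the scheduler (the earlier "Selected host: cpu-1; ... Alternates: []" DEBUG lines), exhausts its retry budget at once and raises MaxRetriesExceeded, after which the instance is set to ERROR. The sketch below is a minimal, hypothetical illustration of that retry-exhaustion flow under those assumptions; the names build_with_reschedule, build_on and max_attempts are invented for illustration and are not Nova's actual code or configuration.

    # Minimal sketch (assumed names, not Nova's implementation) of how a build
    # that keeps failing on its only candidate host ends up in the
    # MaxRetriesExceeded / ERROR state seen in the log records above.
    class RescheduledException(Exception):
        pass

    class MaxRetriesExceeded(Exception):
        pass

    def build_with_reschedule(instance, candidate_hosts, build_on, max_attempts=3):
        """Try each candidate host in turn; give up when hosts or attempts run out."""
        attempts = 0
        for host in candidate_hosts:
            attempts += 1
            try:
                return build_on(host, instance)   # stands in for driver.spawn(...)
            except RescheduledException:
                if attempts >= max_attempts:
                    break                         # retry budget exhausted
                continue                          # otherwise try the next alternate
        # With candidate_hosts == ['cpu-1'] and no alternates, a single failure
        # is enough to land here, mirroring the warnings above.
        raise MaxRetriesExceeded(
            "Exceeded maximum number of retries. Exhausted all hosts available "
            f"for retrying build failures for instance {instance}.")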
[ 798.514612] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2342e3da-6d68-466a-9140-ced4eeda73d7. [ 798.537381] nova-conductor[52019]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 798.558665] nova-conductor[52019]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 798.561925] nova-conductor[52019]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 822.351123] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Targeting cell 11743078-64bf-4468-a786-557c60808969(cell1) for conductor method rebuild_instance {{(pid=52019) wrapper /opt/stack/nova/nova/conductor/manager.py:95}} [ 822.351412] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 822.351570] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 822.351737] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 822.362615] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 
tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] No migration record for the rebuild/evacuate request. {{(pid=52019) rebuild_instance /opt/stack/nova/nova/conductor/manager.py:1227}} [ 836.669024] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 836.669423] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 836.669423] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Failed to compute_task_build_instances: 
Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 294a5f91-9db2-4a43-8230-d3e6906c30f0.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 294a5f91-9db2-4a43-8230-d3e6906c30f0. [ 836.669627] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 294a5f91-9db2-4a43-8230-d3e6906c30f0. [ 836.697985] nova-conductor[52019]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 836.714993] nova-conductor[52019]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 836.717951] nova-conductor[52019]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 837.206095] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Took 0.11 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 837.226460] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.226687] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.226860] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.276966] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.277210] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.277375] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.277776] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.277949] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock 
"11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.278133] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.288174] nova-conductor[52020]: DEBUG nova.quota [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Getting quotas for project d239a4f0ed5b48cf9cd9a334de6f189c. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 837.293100] nova-conductor[52020]: DEBUG nova.quota [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Getting quotas for user 2e0c20ce66e045a5bfdffc27e037327e and project d239a4f0ed5b48cf9cd9a334de6f189c. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 837.300650] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 837.301146] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.301357] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.301568] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.304259] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 
tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 837.304932] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.305154] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.305320] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.317258] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.317463] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.317630] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.639168] 
nova-conductor[52020]: ERROR nova.scheduler.utils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 8264a1ad-cf20-404f-9d30-30c126e0c222 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 848.640098] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Rescheduling: True {{(pid=52020) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 848.640098] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8264a1ad-cf20-404f-9d30-30c126e0c222.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 8264a1ad-cf20-404f-9d30-30c126e0c222. [ 848.640483] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8264a1ad-cf20-404f-9d30-30c126e0c222. [ 848.662364] nova-conductor[52020]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] deallocate_for_instance() {{(pid=52020) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 848.689386] nova-conductor[52020]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Instance cache missing network info. {{(pid=52020) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 848.692946] nova-conductor[52020]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Updating instance_info_cache with network_info: [] {{(pid=52020) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 870.193508] nova-conductor[52019]: Traceback (most recent call last): [ 870.193508] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 870.193508] nova-conductor[52019]: return func(*args, **kwargs) [ 870.193508] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 870.193508] nova-conductor[52019]: selections = self._select_destinations( [ 870.193508] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 870.193508] nova-conductor[52019]: selections = self._schedule( [ 870.193508] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 870.193508] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 870.193508] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 870.193508] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 870.193508] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.193508] nova-conductor[52019]: ERROR nova.conductor.manager [ 870.201359] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 870.201669] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 870.201785] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 870.254442] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 057d3204-a94c-40e3-b339-c5d2ff9ff0a1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 870.255211] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 870.257995] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 870.257995] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 870.258800] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 870.258800] nova-conductor[52019]: Traceback (most recent call last): [ 870.258800] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 870.258800] nova-conductor[52019]: return func(*args, **kwargs) [ 870.258800] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 870.258800] nova-conductor[52019]: selections = self._select_destinations( [ 870.258800] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 870.258800] nova-conductor[52019]: selections = self._schedule( [ 870.258800] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 870.258800] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 870.258800] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 870.258800] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 870.258800] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 870.258800] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 870.259289] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-2a69071b-0bc3-4235-a9d4-2b9762609ccd tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 057d3204-a94c-40e3-b339-c5d2ff9ff0a1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 871.962058] nova-conductor[52020]: Traceback (most recent call last): [ 871.962058] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 871.962058] nova-conductor[52020]: return func(*args, **kwargs) [ 871.962058] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 871.962058] nova-conductor[52020]: selections = self._select_destinations( [ 871.962058] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 871.962058] nova-conductor[52020]: selections = self._schedule( [ 871.962058] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 871.962058] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 871.962058] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 871.962058] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 871.962058] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.962058] nova-conductor[52020]: ERROR nova.conductor.manager [ 871.971792] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.972052] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.972227] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.030777] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] [instance: 0bf69a57-7106-4ed7-a11b-cd47a68ae87a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 872.031422] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 872.031643] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 872.031813] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.037098] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 872.037098] nova-conductor[52020]: Traceback (most recent call last): [ 872.037098] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 872.037098] nova-conductor[52020]: return func(*args, **kwargs) [ 872.037098] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 872.037098] nova-conductor[52020]: selections = self._select_destinations( [ 872.037098] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 872.037098] nova-conductor[52020]: selections = self._schedule( [ 872.037098] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 872.037098] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 872.037098] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 872.037098] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 872.037098] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 872.037098] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 872.037098] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-23815266-8ae5-4908-a48b-5a4835689e9c tempest-ServersNegativeTestJSON-1931441335 tempest-ServersNegativeTestJSON-1931441335-project-member] [instance: 0bf69a57-7106-4ed7-a11b-cd47a68ae87a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 874.926040] nova-conductor[52019]: Traceback (most recent call last): [ 874.926040] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 874.926040] nova-conductor[52019]: return func(*args, **kwargs) [ 874.926040] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 874.926040] nova-conductor[52019]: selections = self._select_destinations( [ 874.926040] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 874.926040] nova-conductor[52019]: selections = self._schedule( [ 874.926040] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 874.926040] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 874.926040] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 874.926040] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 874.926040] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.926040] nova-conductor[52019]: ERROR nova.conductor.manager [ 874.942717] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 874.942717] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 874.942717] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 874.986398] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 3f72d1f3-b87e-485c-83dd-bf702f3fb283] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 874.988998] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 874.988998] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 874.988998] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 874.992743] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 874.992743] nova-conductor[52019]: Traceback (most recent call last): [ 874.992743] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 874.992743] nova-conductor[52019]: return func(*args, **kwargs) [ 874.992743] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 874.992743] nova-conductor[52019]: selections = self._select_destinations( [ 874.992743] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 874.992743] nova-conductor[52019]: selections = self._schedule( [ 874.992743] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 874.992743] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 874.992743] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 874.992743] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 874.992743] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 874.992743] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 874.994253] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-ac775eca-668f-47b5-b70f-e0649c59b0c9 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 3f72d1f3-b87e-485c-83dd-bf702f3fb283] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 875.120586] nova-conductor[52020]: Traceback (most recent call last): [ 875.120586] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 875.120586] nova-conductor[52020]: return func(*args, **kwargs) [ 875.120586] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 875.120586] nova-conductor[52020]: selections = self._select_destinations( [ 875.120586] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 875.120586] nova-conductor[52020]: selections = self._schedule( [ 875.120586] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 875.120586] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 875.120586] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 875.120586] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 875.120586] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.120586] nova-conductor[52020]: ERROR nova.conductor.manager [ 875.127830] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 875.128124] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 875.128318] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 875.173165] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: ab676438-ff9f-4cb7-8727-9153b124f967] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 875.173856] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 875.174079] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 875.174256] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 875.179371] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 875.179371] nova-conductor[52020]: Traceback (most recent call last): [ 875.179371] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 875.179371] nova-conductor[52020]: return func(*args, **kwargs) [ 875.179371] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 875.179371] nova-conductor[52020]: selections = self._select_destinations( [ 875.179371] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 875.179371] nova-conductor[52020]: selections = self._schedule( [ 875.179371] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 875.179371] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 875.179371] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 875.179371] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 875.179371] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 875.179371] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 875.179913] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-f576727a-3907-4296-a638-fd043e41f513 tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: ab676438-ff9f-4cb7-8727-9153b124f967] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 876.660622] nova-conductor[52019]: Traceback (most recent call last): [ 876.660622] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 876.660622] nova-conductor[52019]: return func(*args, **kwargs) [ 876.660622] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 876.660622] nova-conductor[52019]: selections = self._select_destinations( [ 876.660622] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 876.660622] nova-conductor[52019]: selections = self._schedule( [ 876.660622] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 876.660622] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 876.660622] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 876.660622] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 876.660622] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.660622] nova-conductor[52019]: ERROR nova.conductor.manager [ 876.666871] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 876.667108] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 876.667356] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 876.721251] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: ecdc3209-5525-46f5-a7ba-30b0bcb3b6c3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 876.721974] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 876.722215] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 876.722383] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 876.726367] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 876.726367] nova-conductor[52019]: Traceback (most recent call last): [ 876.726367] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 876.726367] nova-conductor[52019]: return func(*args, **kwargs) [ 876.726367] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 876.726367] nova-conductor[52019]: selections = self._select_destinations( [ 876.726367] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 876.726367] nova-conductor[52019]: selections = self._schedule( [ 876.726367] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 876.726367] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 876.726367] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 876.726367] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 876.726367] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 876.726367] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 876.726830] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-c56e569f-ca40-4057-8128-36443618548f tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: ecdc3209-5525-46f5-a7ba-30b0bcb3b6c3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 877.970736] nova-conductor[52020]: Traceback (most recent call last): [ 877.970736] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 877.970736] nova-conductor[52020]: return func(*args, **kwargs) [ 877.970736] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 877.970736] nova-conductor[52020]: selections = self._select_destinations( [ 877.970736] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 877.970736] nova-conductor[52020]: selections = self._schedule( [ 877.970736] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 877.970736] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 877.970736] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 877.970736] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 877.970736] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
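The conductor reports the error class as nova.exception_Remote.NoValidHost_Remote rather than nova.exception.NoValidHost because the exception is raised inside the nova-scheduler service and comes back to nova-conductor over the oslo.messaging RPC call shown in the traceback above (cctxt.call(ctxt, 'select_destinations', ...)); on the calling side it is rebuilt under a name and module carrying a _Remote suffix. The snippet below only imitates that renaming so the log line is easier to read; make_remote_type and the NoValidHost class are illustrative stand-ins, not oslo.messaging's implementation.

def make_remote_type(exc_type):
    # Build a subclass whose class name and module both gain a "_Remote"
    # suffix, mirroring how the re-raised exception is labelled in this log.
    return type(exc_type.__name__ + "_Remote",
                (exc_type,),
                {"__module__": exc_type.__module__ + "_Remote"})

class NoValidHost(Exception):
    pass

RemoteNoValidHost = make_remote_type(NoValidHost)
err = RemoteNoValidHost("No valid host was found. There are not enough hosts available.")
print("%s.%s" % (type(err).__module__, type(err).__name__))
# Prints "__main___Remote.NoValidHost_Remote" here; with the original class
# living in nova.exception, the log shows "nova.exception_Remote.NoValidHost_Remote".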
[ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.970736] nova-conductor[52020]: ERROR nova.conductor.manager [ 877.978510] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 877.979055] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 877.979055] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.038857] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 06b19d37-a077-4239-94b6-34faf90399fe] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 878.039033] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.039259] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.039425] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.043760] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 878.043760] nova-conductor[52020]: Traceback (most recent call last): [ 878.043760] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 878.043760] nova-conductor[52020]: return func(*args, **kwargs) [ 878.043760] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 878.043760] nova-conductor[52020]: selections = self._select_destinations( [ 878.043760] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 878.043760] nova-conductor[52020]: selections = self._schedule( [ 878.043760] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 878.043760] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 878.043760] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 878.043760] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 878.043760] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 878.043760] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 878.044319] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-1df1330e-c5b7-4891-857d-0c50eb75a02f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 06b19d37-a077-4239-94b6-34faf90399fe] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 879.903235] nova-conductor[52019]: Traceback (most recent call last): [ 879.903235] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 879.903235] nova-conductor[52019]: return func(*args, **kwargs) [ 879.903235] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 879.903235] nova-conductor[52019]: selections = self._select_destinations( [ 879.903235] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 879.903235] nova-conductor[52019]: selections = self._schedule( [ 879.903235] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 879.903235] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 879.903235] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 879.903235] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 879.903235] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
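Each failed request finishes the same way a few entries later: nova.scheduler.utils logs "Failed to compute_task_build_instances" with the same NoValidHost traceback, then "Setting instance to ERROR state." In other words, once select_destinations comes back empty the conductor does not build anything; it records the fault on the instance and parks it in ERROR. A rough, self-contained sketch of that control flow follows; the function and dict fields are placeholders, not nova.scheduler.utils code.

class NoValidHost(Exception):
    pass

def build_or_fail(instance, select_destinations):
    # select_destinations stands in for the RPC round-trip to nova-scheduler.
    try:
        return select_destinations()
    except NoValidHost as exc:
        # Corresponds to "Setting instance to ERROR state." in the log:
        # the request is abandoned and the failure text becomes the fault.
        instance["vm_state"] = "error"
        instance["fault"] = "No valid host was found. %s" % exc
        return None

instance = {"uuid": "57be078d-9dae-4a47-8d4c-875d8650089e"}  # one of the instances below
def no_hosts():
    raise NoValidHost("There are not enough hosts available.")

build_or_fail(instance, no_hosts)
print(instance["vm_state"], "-", instance["fault"])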
[ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.903235] nova-conductor[52019]: ERROR nova.conductor.manager [ 879.913718] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.913959] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.914349] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.975343] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: 57be078d-9dae-4a47-8d4c-875d8650089e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 879.975343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.975343] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.975527] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.978993] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 879.978993] nova-conductor[52019]: Traceback (most recent call last): [ 879.978993] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 879.978993] nova-conductor[52019]: return func(*args, **kwargs) [ 879.978993] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 879.978993] nova-conductor[52019]: selections = self._select_destinations( [ 879.978993] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 879.978993] nova-conductor[52019]: selections = self._schedule( [ 879.978993] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 879.978993] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 879.978993] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 879.978993] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 879.978993] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 879.978993] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 879.979609] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-0a75bf5d-2dd5-49f4-bcc1-0c0a3a51d4f0 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: 57be078d-9dae-4a47-8d4c-875d8650089e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 881.026231] nova-conductor[52020]: Traceback (most recent call last): [ 881.026231] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 881.026231] nova-conductor[52020]: return func(*args, **kwargs) [ 881.026231] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 881.026231] nova-conductor[52020]: selections = self._select_destinations( [ 881.026231] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 881.026231] nova-conductor[52020]: selections = self._schedule( [ 881.026231] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 881.026231] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 881.026231] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 881.026231] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 881.026231] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
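Alongside each failure the conductor also dumps the BlockDeviceMapping it would have used: boot_index=0, source_type='image', destination_type='local', image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53', delete_on_termination=True, and no volume fields set. That is a plain boot-from-image server with an ephemeral local root disk; no Cinder volume is involved in these requests. For orientation only, the request-side equivalent of that mapping looks roughly like the dict below (the exact API payload shape is an assumption here; the field values come from the log).

# Roughly the request-side form of the BlockDeviceMapping in the DEBUG lines:
# boot index 0, backed by the image, written to local disk, deleted with the
# server.  Illustrative only.
image_id = "a816e082-61f0-4ffa-a214-1bf6bd197f53"  # image_id seen in the log

block_device_mapping = [{
    "boot_index": 0,
    "source_type": "image",
    "destination_type": "local",
    "uuid": image_id,
    "delete_on_termination": True,
}]

print(block_device_mapping)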
[ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.026231] nova-conductor[52020]: ERROR nova.conductor.manager [ 881.032780] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.033016] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.033208] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 881.092700] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 619d69b4-4fd5-4d17-b3fe-50dc655150d0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 881.092850] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.093045] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.093336] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 881.101398] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 881.101398] nova-conductor[52020]: Traceback (most recent call last): [ 881.101398] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 881.101398] nova-conductor[52020]: return func(*args, **kwargs) [ 881.101398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 881.101398] nova-conductor[52020]: selections = self._select_destinations( [ 881.101398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 881.101398] nova-conductor[52020]: selections = self._schedule( [ 881.101398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 881.101398] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 881.101398] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 881.101398] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 881.101398] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 881.101398] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 881.101973] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7ec58ff8-4990-431e-85b7-7ec11d5567fd tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 619d69b4-4fd5-4d17-b3fe-50dc655150d0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 881.130222] nova-conductor[52019]: Traceback (most recent call last): [ 881.130222] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 881.130222] nova-conductor[52019]: return func(*args, **kwargs) [ 881.130222] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 881.130222] nova-conductor[52019]: selections = self._select_destinations( [ 881.130222] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 881.130222] nova-conductor[52019]: selections = self._schedule( [ 881.130222] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 881.130222] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 881.130222] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 881.130222] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 881.130222] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.130222] nova-conductor[52019]: ERROR nova.conductor.manager [ 881.138148] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.138255] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.138357] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 881.187897] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] [instance: 5a28964c-26cb-43ef-8192-39d088e92fc2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 881.188665] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.189154] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.189154] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 881.195020] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 881.195020] nova-conductor[52019]: Traceback (most recent call last): [ 881.195020] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 881.195020] nova-conductor[52019]: return func(*args, **kwargs) [ 881.195020] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 881.195020] nova-conductor[52019]: selections = self._select_destinations( [ 881.195020] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 881.195020] nova-conductor[52019]: selections = self._schedule( [ 881.195020] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 881.195020] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 881.195020] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 881.195020] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 881.195020] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 881.195020] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 881.195020] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9e65b975-6224-4a7f-99b6-95ff3a88be2a tempest-ServerMetadataTestJSON-1850876667 tempest-ServerMetadataTestJSON-1850876667-project-member] [instance: 5a28964c-26cb-43ef-8192-39d088e92fc2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 883.084860] nova-conductor[52020]: Traceback (most recent call last): [ 883.084860] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 883.084860] nova-conductor[52020]: return func(*args, **kwargs) [ 883.084860] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 883.084860] nova-conductor[52020]: selections = self._select_destinations( [ 883.084860] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 883.084860] nova-conductor[52020]: selections = self._schedule( [ 883.084860] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 883.084860] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 883.084860] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 883.084860] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 883.084860] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.084860] nova-conductor[52020]: ERROR nova.conductor.manager [ 883.091760] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 883.091985] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 883.092169] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 883.134089] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: e5438fb2-ab9d-4822-8d89-977264007e0f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 883.134089] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 883.134089] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 883.134089] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 883.135162] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 883.135162] nova-conductor[52020]: Traceback (most recent call last): [ 883.135162] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 883.135162] nova-conductor[52020]: return func(*args, **kwargs) [ 883.135162] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 883.135162] nova-conductor[52020]: selections = self._select_destinations( [ 883.135162] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 883.135162] nova-conductor[52020]: selections = self._schedule( [ 883.135162] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 883.135162] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 883.135162] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 883.135162] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 883.135162] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 883.135162] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 883.135826] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5c656e20-2806-41bc-bed1-a27b32a6263f tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: e5438fb2-ab9d-4822-8d89-977264007e0f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 884.222916] nova-conductor[52020]: Traceback (most recent call last): [ 884.222916] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 884.222916] nova-conductor[52020]: return func(*args, **kwargs) [ 884.222916] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 884.222916] nova-conductor[52020]: selections = self._select_destinations( [ 884.222916] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 884.222916] nova-conductor[52020]: selections = self._schedule( [ 884.222916] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 884.222916] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 884.222916] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 884.222916] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 884.222916] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.222916] nova-conductor[52020]: ERROR nova.conductor.manager [ 884.237608] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 884.237916] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 884.238140] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 884.323057] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 5382cfb7-f75c-4310-905f-ad7b71738afe] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 884.324027] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 884.324248] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 884.324421] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 884.328535] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 884.328535] nova-conductor[52020]: Traceback (most recent call last): [ 884.328535] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 884.328535] nova-conductor[52020]: return func(*args, **kwargs) [ 884.328535] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 884.328535] nova-conductor[52020]: selections = self._select_destinations( [ 884.328535] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 884.328535] nova-conductor[52020]: selections = self._schedule( [ 884.328535] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 884.328535] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 884.328535] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 884.328535] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 884.328535] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 884.328535] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 884.329380] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5027840d-2b0e-48e8-8d91-d627851a9bd4 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 5382cfb7-f75c-4310-905f-ad7b71738afe] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 885.057442] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 19253198-cb6e-4c48-a88b-26780f3606e8 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 885.058069] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 885.058320] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 19253198-cb6e-4c48-a88b-26780f3606e8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 19253198-cb6e-4c48-a88b-26780f3606e8. [ 885.058535] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 19253198-cb6e-4c48-a88b-26780f3606e8. [ 885.084466] nova-conductor[52019]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 885.110848] nova-conductor[52020]: DEBUG nova.db.main.api [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Created instance_extra for 5df12084-5dd6-41d1-9743-747f17ce3323 {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.166869] nova-conductor[52020]: DEBUG nova.db.main.api [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Created instance_extra for 2ed6496a-3e75-4cfd-88da-9e0b731f738a {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.181447] nova-conductor[52019]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance cache missing network info. 
{{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 885.187114] nova-conductor[52019]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 885.255891] nova-conductor[52020]: DEBUG nova.db.main.api [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Created instance_extra for 49d76773-e163-440b-aa99-08c379155149 {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.341653] nova-conductor[52020]: DEBUG nova.db.main.api [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Created instance_extra for dac8465a-592f-461c-af5b-49369eed5e70 {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.413341] nova-conductor[52019]: DEBUG nova.db.main.api [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Created instance_extra for 54605814-fdf4-43c7-9316-0d2594cdb5fa {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.477376] nova-conductor[52020]: DEBUG nova.db.main.api [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Created instance_extra for f196648e-0e82-4a01-91fc-af1ba61f0490 {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.555347] nova-conductor[52020]: DEBUG nova.db.main.api [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Created instance_extra for 66420486-d25e-457d-94cd-6f96fca2df7d {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 885.603574] nova-conductor[52020]: Traceback (most recent call last): [ 885.603574] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 885.603574] nova-conductor[52020]: return func(*args, **kwargs) [ 885.603574] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 885.603574] nova-conductor[52020]: selections = self._select_destinations( [ 885.603574] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 885.603574] nova-conductor[52020]: selections = self._schedule( [ 885.603574] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 885.603574] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 885.603574] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 885.603574] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 885.603574] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.603574] nova-conductor[52020]: ERROR nova.conductor.manager [ 885.612664] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.612895] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.613113] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.616214] nova-conductor[52019]: DEBUG nova.db.main.api [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Created instance_extra for a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0 {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 885.660117] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: d149565e-1a7e-4598-b771-e9cb65dc3004] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 885.660807] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.661355] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.661355] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.664440] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 885.664440] nova-conductor[52020]: Traceback (most recent call last): [ 885.664440] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 885.664440] nova-conductor[52020]: return func(*args, **kwargs) [ 885.664440] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 885.664440] nova-conductor[52020]: selections = self._select_destinations( [ 885.664440] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 885.664440] nova-conductor[52020]: selections = self._schedule( [ 885.664440] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 885.664440] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 885.664440] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 885.664440] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 885.664440] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 885.664440] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 885.664875] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-9bfa8f23-0542-4fc5-b924-fde2e4934263 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: d149565e-1a7e-4598-b771-e9cb65dc3004] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 886.407952] nova-conductor[52020]: Traceback (most recent call last): [ 886.407952] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 886.407952] nova-conductor[52020]: return func(*args, **kwargs) [ 886.407952] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 886.407952] nova-conductor[52020]: selections = self._select_destinations( [ 886.407952] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 886.407952] nova-conductor[52020]: selections = self._schedule( [ 886.407952] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 886.407952] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 886.407952] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 886.407952] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 886.407952] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.407952] nova-conductor[52020]: ERROR nova.conductor.manager [ 886.414450] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.414673] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.414841] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.453963] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 94c33d75-7797-40f7-8dad-02d6e9a6bc1e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 886.454642] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.454880] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.455028] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.459850] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 886.459850] nova-conductor[52020]: Traceback (most recent call last): [ 886.459850] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 886.459850] nova-conductor[52020]: return func(*args, **kwargs) [ 886.459850] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 886.459850] nova-conductor[52020]: selections = self._select_destinations( [ 886.459850] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 886.459850] nova-conductor[52020]: selections = self._schedule( [ 886.459850] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 886.459850] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 886.459850] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 886.459850] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 886.459850] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 886.459850] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 886.459850] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-7362637c-7d71-4e22-8b2c-50fa34785c48 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 94c33d75-7797-40f7-8dad-02d6e9a6bc1e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 887.525550] nova-conductor[52020]: Traceback (most recent call last): [ 887.525550] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 887.525550] nova-conductor[52020]: return func(*args, **kwargs) [ 887.525550] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 887.525550] nova-conductor[52020]: selections = self._select_destinations( [ 887.525550] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 887.525550] nova-conductor[52020]: selections = self._schedule( [ 887.525550] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 887.525550] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 887.525550] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 887.525550] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 887.525550] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.525550] nova-conductor[52020]: ERROR nova.conductor.manager [ 887.532169] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 887.532395] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 887.532564] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 887.570219] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: 56022297-6c1c-4dc5-bf0c-062a93fe7331] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 887.570940] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 887.571178] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 887.571345] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 887.574485] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 887.574485] nova-conductor[52020]: Traceback (most recent call last): [ 887.574485] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 887.574485] nova-conductor[52020]: return func(*args, **kwargs) [ 887.574485] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 887.574485] nova-conductor[52020]: selections = self._select_destinations( [ 887.574485] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 887.574485] nova-conductor[52020]: selections = self._schedule( [ 887.574485] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 887.574485] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 887.574485] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 887.574485] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 887.574485] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 887.574485] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 887.575007] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-0634f099-8d5e-441e-a69a-210679a185d5 tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: 56022297-6c1c-4dc5-bf0c-062a93fe7331] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 888.898929] nova-conductor[52020]: Traceback (most recent call last): [ 888.898929] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 888.898929] nova-conductor[52020]: return func(*args, **kwargs) [ 888.898929] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 888.898929] nova-conductor[52020]: selections = self._select_destinations( [ 888.898929] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 888.898929] nova-conductor[52020]: selections = self._schedule( [ 888.898929] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 888.898929] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 888.898929] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 888.898929] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 888.898929] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.898929] nova-conductor[52020]: ERROR nova.conductor.manager [ 888.906846] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 888.906846] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 888.906846] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 888.955559] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 194814ac-60d1-4f7e-abd3-a1c8dc5843fd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 888.956325] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 888.956541] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 888.956708] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 888.961023] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 888.961023] nova-conductor[52020]: Traceback (most recent call last): [ 888.961023] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 888.961023] nova-conductor[52020]: return func(*args, **kwargs) [ 888.961023] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 888.961023] nova-conductor[52020]: selections = self._select_destinations( [ 888.961023] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 888.961023] nova-conductor[52020]: selections = self._schedule( [ 888.961023] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 888.961023] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 888.961023] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 888.961023] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 888.961023] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 888.961023] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 888.961023] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-16b448cb-b681-4707-b551-e6d82c2b9eaa tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 194814ac-60d1-4f7e-abd3-a1c8dc5843fd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 891.865625] nova-conductor[52019]: Traceback (most recent call last): [ 891.865625] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 891.865625] nova-conductor[52019]: return func(*args, **kwargs) [ 891.865625] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 891.865625] nova-conductor[52019]: selections = self._select_destinations( [ 891.865625] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 891.865625] nova-conductor[52019]: selections = self._schedule( [ 891.865625] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 891.865625] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 891.865625] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 891.865625] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 891.865625] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.865625] nova-conductor[52019]: ERROR nova.conductor.manager [ 891.883138] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 891.883274] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 891.883421] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 891.952506] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] [instance: 947e0174-fba0-496a-b314-217a49464527] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 891.953105] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 891.953323] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 891.953488] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 891.957097] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 891.957097] nova-conductor[52019]: Traceback (most recent call last): [ 891.957097] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 891.957097] nova-conductor[52019]: return func(*args, **kwargs) [ 891.957097] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 891.957097] nova-conductor[52019]: selections = self._select_destinations( [ 891.957097] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 891.957097] nova-conductor[52019]: selections = self._schedule( [ 891.957097] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 891.957097] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 891.957097] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 891.957097] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 891.957097] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 891.957097] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 891.957952] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-99a25cff-91af-4924-ba85-eb5648a0307d tempest-ServersTestJSON-394239745 tempest-ServersTestJSON-394239745-project-member] [instance: 947e0174-fba0-496a-b314-217a49464527] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 896.827898] nova-conductor[52020]: Traceback (most recent call last): [ 896.827898] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 896.827898] nova-conductor[52020]: return func(*args, **kwargs) [ 896.827898] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 896.827898] nova-conductor[52020]: selections = self._select_destinations( [ 896.827898] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 896.827898] nova-conductor[52020]: selections = self._schedule( [ 896.827898] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 896.827898] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 896.827898] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 896.827898] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 896.827898] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.827898] nova-conductor[52020]: ERROR nova.conductor.manager [ 896.843782] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 896.844023] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.844199] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 896.890268] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: 229f87c6-59f3-4844-acf9-0e2d86a09e7c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 896.890982] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 896.891449] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.891449] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 896.897118] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 896.897118] nova-conductor[52020]: Traceback (most recent call last): [ 896.897118] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 896.897118] nova-conductor[52020]: return func(*args, **kwargs) [ 896.897118] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 896.897118] nova-conductor[52020]: selections = self._select_destinations( [ 896.897118] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 896.897118] nova-conductor[52020]: selections = self._schedule( [ 896.897118] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 896.897118] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 896.897118] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 896.897118] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 896.897118] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 896.897118] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 896.897118] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: 229f87c6-59f3-4844-acf9-0e2d86a09e7c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 896.935587] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 896.935587] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.935587] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 896.978558] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: 01d6cc22-a864-4096-9799-770464b00baf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 896.979287] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 896.979491] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.979652] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 896.983199] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 896.983199] nova-conductor[52020]: Traceback (most recent call last): [ 896.983199] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 896.983199] nova-conductor[52020]: return func(*args, **kwargs) [ 896.983199] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 896.983199] nova-conductor[52020]: selections = self._select_destinations( [ 896.983199] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 896.983199] nova-conductor[52020]: selections = self._schedule( [ 896.983199] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 896.983199] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 896.983199] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 896.983199] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 896.983199] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 896.983199] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 896.985497] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-5d235344-205a-47ec-8152-3594f4a983a9 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: 01d6cc22-a864-4096-9799-770464b00baf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 897.558744] nova-conductor[52019]: DEBUG nova.db.main.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Created instance_extra for a6ff207e-a925-46d1-9aaf-e06268d3c6f2 {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 899.690210] nova-conductor[52019]: Traceback (most recent call last): [ 899.690210] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 899.690210] nova-conductor[52019]: return func(*args, **kwargs) [ 899.690210] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 899.690210] nova-conductor[52019]: selections = self._select_destinations( [ 899.690210] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 899.690210] nova-conductor[52019]: selections = self._schedule( [ 899.690210] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 899.690210] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 899.690210] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 899.690210] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 899.690210] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.690210] nova-conductor[52019]: ERROR nova.conductor.manager [ 899.698398] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.698629] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.698803] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.748273] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] [instance: 13266282-b6e8-4de0-87c4-df45db115cff] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 899.748273] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.748273] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.748273] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 
tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.749768] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 899.749768] nova-conductor[52019]: Traceback (most recent call last): [ 899.749768] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 899.749768] nova-conductor[52019]: return func(*args, **kwargs) [ 899.749768] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 899.749768] nova-conductor[52019]: selections = self._select_destinations( [ 899.749768] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 899.749768] nova-conductor[52019]: selections = self._schedule( [ 899.749768] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 899.749768] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 899.749768] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 899.749768] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 899.749768] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 899.749768] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 899.753857] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] [instance: 13266282-b6e8-4de0-87c4-df45db115cff] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
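On the conductor side the same exception arrives through cctxt.call() as nova.exception_Remote.NoValidHost_Remote: the RPC layer serializes the scheduler's exception and rebuilds it on the caller with a "_Remote" suffix on the class name, which is why both spellings appear in these lines. The snippet below only mimics that observable renaming; it is a self-contained illustration, not oslo.messaging code, and rebuild_remote_exception is a made-up helper name.

# Illustration of the "_Remote" naming pattern visible above
# (nova.exception.NoValidHost vs. nova.exception_Remote.NoValidHost_Remote).
class NoValidHost(Exception):
    pass

def rebuild_remote_exception(ex_type, message):
    # Assumption: the rebuilt type keeps the original class as a base, so a
    # plain "except NoValidHost" still catches the remote variant.
    remote_type = type(ex_type.__name__ + "_Remote", (ex_type,), {})
    return remote_type(message)

exc = rebuild_remote_exception(
    NoValidHost,
    "No valid host was found. There are not enough hosts available.")
try:
    raise exc
except NoValidHost as caught:
    print(type(caught).__name__, "-", caught)  # NoValidHost_Remote - ...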
[ 899.776457] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.776759] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.776939] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.817182] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] [instance: 7a87fc7f-b725-4eb9-98d6-6e3b1688a6be] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 899.817479] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.817584] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.817737] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.823482] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 899.823482] nova-conductor[52019]: Traceback (most recent call last): [ 899.823482] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 899.823482] nova-conductor[52019]: return func(*args, **kwargs) [ 899.823482] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 899.823482] nova-conductor[52019]: selections = self._select_destinations( [ 899.823482] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 899.823482] nova-conductor[52019]: selections = self._schedule( [ 899.823482] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 899.823482] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 899.823482] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 899.823482] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 899.823482] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 899.823482] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 899.824099] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] [instance: 7a87fc7f-b725-4eb9-98d6-6e3b1688a6be] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 899.846642] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.846746] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.846865] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.893986] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] [instance: 98c527da-012c-4d9c-a9db-b75fec5b426a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 899.895173] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.895173] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.895173] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.898891] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 899.898891] nova-conductor[52019]: Traceback (most recent call last): [ 899.898891] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 899.898891] nova-conductor[52019]: return func(*args, **kwargs) [ 899.898891] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 899.898891] nova-conductor[52019]: selections = self._select_destinations( [ 899.898891] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 899.898891] nova-conductor[52019]: selections = self._schedule( [ 899.898891] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 899.898891] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 899.898891] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 899.898891] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 899.898891] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 899.898891] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 899.899492] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-fb60e5eb-3cc1-46b0-ac5e-cb93d08eb6d8 tempest-ListServersNegativeTestJSON-891921015 tempest-ListServersNegativeTestJSON-891921015-project-member] [instance: 98c527da-012c-4d9c-a9db-b75fec5b426a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 901.099590] nova-conductor[52019]: Traceback (most recent call last): [ 901.099590] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 901.099590] nova-conductor[52019]: return func(*args, **kwargs) [ 901.099590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 901.099590] nova-conductor[52019]: selections = self._select_destinations( [ 901.099590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 901.099590] nova-conductor[52019]: selections = self._schedule( [ 901.099590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 901.099590] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 901.099590] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 901.099590] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 901.099590] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.099590] nova-conductor[52019]: ERROR nova.conductor.manager [ 901.106941] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 901.107184] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.107356] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.145011] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: 20b985de-cc32-41ea-b090-385251708ce5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 901.145708] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 901.145917] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.146123] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.148754] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 901.148754] nova-conductor[52019]: Traceback (most recent call last): [ 901.148754] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 901.148754] nova-conductor[52019]: return func(*args, **kwargs) [ 901.148754] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 901.148754] nova-conductor[52019]: selections = self._select_destinations( [ 901.148754] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 901.148754] nova-conductor[52019]: selections = self._schedule( [ 901.148754] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 901.148754] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 901.148754] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 901.148754] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 901.148754] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 901.148754] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 901.149277] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: 20b985de-cc32-41ea-b090-385251708ce5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 901.170972] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 901.170972] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.170972] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.210316] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: ba0f9256-5117-4e3c-9aca-7d2457d8f064] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 901.211015] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 901.211246] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.211415] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.214274] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 901.214274] nova-conductor[52019]: Traceback (most recent call last): [ 901.214274] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 901.214274] nova-conductor[52019]: return func(*args, **kwargs) [ 901.214274] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 901.214274] nova-conductor[52019]: selections = self._select_destinations( [ 901.214274] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 901.214274] nova-conductor[52019]: selections = self._schedule( [ 901.214274] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 901.214274] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 901.214274] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 901.214274] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 901.214274] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 901.214274] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 901.214795] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-65480d0b-566c-4b72-af99-f13237b9db86 tempest-MultipleCreateTestJSON-1627576137 tempest-MultipleCreateTestJSON-1627576137-project-member] [instance: ba0f9256-5117-4e3c-9aca-7d2457d8f064] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 908.587856] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 908.599925] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.600171] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.600348] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.633749] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.633749] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.633749] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.634062] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.634250] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.634410] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.642912] nova-conductor[52020]: DEBUG nova.quota [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Getting quotas for project cc08c67065e0450e87f01130f1571b3f. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 908.645277] nova-conductor[52020]: DEBUG nova.quota [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Getting quotas for user c5725815ee2c4c67bc5cdc3384140761 and project cc08c67065e0450e87f01130f1571b3f. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 908.651106] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 908.651326] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.651522] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.651690] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.655448] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 908.656078] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.656279] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.656443] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.681764] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.682025] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.682234] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.464105] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 
tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 931.464616] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 931.464844] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd. 
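Here the failure mode changes: the build reached cpu-1, the VMware driver's copy_virtual_disk call failed with VimFaultException ("A specified parameter was not correct: fileType"), the compute manager raised RescheduledException, and the conductor, with no alternate hosts left, gave up with MaxRetriesExceeded. A simplified sketch of the retry accounting behind that decision (loosely modeled on nova.scheduler.utils.populate_retry; the max_attempts parameter and its default are assumptions for illustration) is:

class MaxRetriesExceeded(Exception):
    pass


def populate_retry(filter_properties, instance_uuid, max_attempts=3):
    # Loosely modeled on nova.scheduler.utils.populate_retry: the real code
    # reads the maximum from configuration; max_attempts=3 here is only an
    # assumed default. Each call records one more build attempt and fails
    # once the limit is exceeded.
    retry = filter_properties.setdefault('retry',
                                         {'num_attempts': 0, 'hosts': []})
    retry['num_attempts'] += 1
    if retry['num_attempts'] > max_attempts:
        raise MaxRetriesExceeded(
            "Exceeded maximum number of retries. Exhausted all hosts "
            "available for retrying build failures for instance %s."
            % instance_uuid)
    return retry


# Example: repeated reschedules of the same build eventually exhaust the
# attempts and produce the MaxRetriesExceeded message logged above.
# props = {}
# for _ in range(4):
#     populate_retry(props, "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd")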
[ 931.465061] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd. [ 931.488931] nova-conductor[52019]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 931.505441] nova-conductor[52019]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 931.508379] nova-conductor[52019]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 938.522943] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52019) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 938.537146] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.537473] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.537710] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.572164] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.572428] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.572602] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.572955] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.573156] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.573315] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.585990] nova-conductor[52019]: DEBUG nova.quota [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Getting quotas for project ea06d39d80ab4c7db76925f3550795fa. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 938.588523] nova-conductor[52019]: DEBUG nova.quota [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Getting quotas for user 578f85992eb84f0fb6aca0e5e23bdd06 and project ea06d39d80ab4c7db76925f3550795fa. Resources: {'instances', 'ram', 'cores'} {{(pid=52019) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 938.594634] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52019) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 938.595252] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.595323] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.595441] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.598187] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 938.598890] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.599106] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.599273] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.611118] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.611320] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.611484] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.879032] nova-conductor[52020]: DEBUG nova.db.main.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Created 
instance_extra for 03742e11-0fb2-48e2-9093-77ea7b647bf3 {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 978.142651] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 978.143281] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 978.143521] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 50ff2169-9c1f-4f7a-b365-1949dac57f86.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50ff2169-9c1f-4f7a-b365-1949dac57f86. [ 978.143847] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50ff2169-9c1f-4f7a-b365-1949dac57f86. [ 978.165113] nova-conductor[52019]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 978.182870] nova-conductor[52019]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 978.185765] nova-conductor[52019]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 987.323276] nova-conductor[52020]: Traceback (most recent call last): [ 987.323276] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 987.323276] nova-conductor[52020]: return func(*args, **kwargs) [ 987.323276] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 987.323276] nova-conductor[52020]: selections = self._select_destinations( [ 987.323276] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 987.323276] nova-conductor[52020]: selections = self._schedule( [ 987.323276] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 987.323276] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 987.323276] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 987.323276] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 987.323276] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.323276] nova-conductor[52020]: ERROR nova.conductor.manager [ 987.330629] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 987.330977] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 987.331075] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.367857] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] [instance: 74e7c008-ec41-452f-99aa-669e36630b90] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 987.368560] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 987.368765] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 987.369022] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 
tempest-ServerDiagnosticsV248Test-1220482493-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.371768] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 987.371768] nova-conductor[52020]: Traceback (most recent call last): [ 987.371768] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 987.371768] nova-conductor[52020]: return func(*args, **kwargs) [ 987.371768] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 987.371768] nova-conductor[52020]: selections = self._select_destinations( [ 987.371768] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 987.371768] nova-conductor[52020]: selections = self._schedule( [ 987.371768] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 987.371768] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 987.371768] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 987.371768] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 987.371768] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 987.371768] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 987.372489] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-76471a14-d602-4b25-8726-2a29163aff5d tempest-ServerDiagnosticsV248Test-1220482493 tempest-ServerDiagnosticsV248Test-1220482493-project-member] [instance: 74e7c008-ec41-452f-99aa-669e36630b90] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 991.384243] nova-conductor[52019]: Traceback (most recent call last): [ 991.384243] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 991.384243] nova-conductor[52019]: return func(*args, **kwargs) [ 991.384243] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 991.384243] nova-conductor[52019]: selections = self._select_destinations( [ 991.384243] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 991.384243] nova-conductor[52019]: selections = self._schedule( [ 991.384243] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 991.384243] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 991.384243] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 991.384243] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 991.384243] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.384243] nova-conductor[52019]: ERROR nova.conductor.manager [ 991.394266] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 991.394498] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 991.394667] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 991.445089] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] [instance: 0f73f6db-1421-4836-9087-b0d040124ec9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 991.445773] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 991.445981] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 991.446202] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 991.449047] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 991.449047] nova-conductor[52019]: Traceback (most recent call last): [ 991.449047] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 991.449047] nova-conductor[52019]: return func(*args, **kwargs) [ 991.449047] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 991.449047] nova-conductor[52019]: selections = self._select_destinations( [ 991.449047] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 991.449047] nova-conductor[52019]: selections = self._schedule( [ 991.449047] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 991.449047] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 991.449047] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 991.449047] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 991.449047] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 991.449047] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 991.449657] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-f8a262a1-1b20-4f66-b924-7619bb7cb28e tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] [instance: 0f73f6db-1421-4836-9087-b0d040124ec9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 995.675420] nova-conductor[52020]: Traceback (most recent call last): [ 995.675420] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 995.675420] nova-conductor[52020]: return func(*args, **kwargs) [ 995.675420] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 995.675420] nova-conductor[52020]: selections = self._select_destinations( [ 995.675420] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 995.675420] nova-conductor[52020]: selections = self._schedule( [ 995.675420] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 995.675420] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 995.675420] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 995.675420] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 995.675420] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.675420] nova-conductor[52020]: ERROR nova.conductor.manager [ 995.682207] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 995.682722] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 995.682903] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 995.743992] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] [instance: d8d0c576-992f-4507-99c7-39b8e19a8b65] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 995.744662] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 995.744873] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 995.745057] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 995.749318] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 995.749318] nova-conductor[52020]: Traceback (most recent call last): [ 995.749318] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 995.749318] nova-conductor[52020]: return func(*args, **kwargs) [ 995.749318] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 995.749318] nova-conductor[52020]: selections = self._select_destinations( [ 995.749318] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 995.749318] nova-conductor[52020]: selections = self._schedule( [ 995.749318] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 995.749318] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 995.749318] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 995.749318] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 995.749318] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 995.749318] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 995.749318] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-d48f79b5-cebe-4086-974f-e311f605028b tempest-AttachVolumeTestJSON-413909156 tempest-AttachVolumeTestJSON-413909156-project-member] [instance: d8d0c576-992f-4507-99c7-39b8e19a8b65] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 997.888569] nova-conductor[52019]: DEBUG nova.db.main.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Created instance_extra for 05010bc2-c30a-49bf-8daa-3eec6a5e9022 {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 997.992960] nova-conductor[52020]: Traceback (most recent call last): [ 997.992960] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 997.992960] nova-conductor[52020]: return func(*args, **kwargs) [ 997.992960] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 997.992960] nova-conductor[52020]: selections = self._select_destinations( [ 997.992960] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 997.992960] nova-conductor[52020]: selections = self._schedule( [ 997.992960] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 997.992960] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 997.992960] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 997.992960] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 997.992960] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 997.992960] nova-conductor[52020]: ERROR nova.conductor.manager [ 998.003156] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 998.003396] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 998.003509] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.059425] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: fda2a05c-7bce-4250-8182-d2f8c41fca8b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 998.060193] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 998.060370] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 998.060541] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.064382] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 998.064382] nova-conductor[52020]: Traceback (most recent call last): [ 998.064382] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 998.064382] nova-conductor[52020]: return func(*args, **kwargs) [ 998.064382] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 998.064382] nova-conductor[52020]: selections = self._select_destinations( [ 998.064382] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 998.064382] nova-conductor[52020]: selections = self._schedule( [ 998.064382] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 998.064382] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 998.064382] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 998.064382] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 998.064382] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 998.064382] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 998.064909] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-23f7c52b-f77f-4bc4-b476-088bbb547536 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: fda2a05c-7bce-4250-8182-d2f8c41fca8b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1003.045733] nova-conductor[52020]: Traceback (most recent call last): [ 1003.045733] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1003.045733] nova-conductor[52020]: return func(*args, **kwargs) [ 1003.045733] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1003.045733] nova-conductor[52020]: selections = self._select_destinations( [ 1003.045733] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1003.045733] nova-conductor[52020]: selections = self._schedule( [ 1003.045733] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1003.045733] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1003.045733] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1003.045733] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1003.045733] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 1003.045733] nova-conductor[52020]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.045733] nova-conductor[52020]: ERROR nova.conductor.manager [ 1003.053191] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.053419] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.053588] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.090960] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] [instance: f13dcee8-331c-4107-b044-0709b15c9ed0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1003.091639] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.092093] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.092188] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 
tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.095114] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1003.095114] nova-conductor[52020]: Traceback (most recent call last): [ 1003.095114] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1003.095114] nova-conductor[52020]: return func(*args, **kwargs) [ 1003.095114] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1003.095114] nova-conductor[52020]: selections = self._select_destinations( [ 1003.095114] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1003.095114] nova-conductor[52020]: selections = self._schedule( [ 1003.095114] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1003.095114] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1003.095114] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1003.095114] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1003.095114] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1003.095114] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1003.095634] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-dd46b735-0409-41ee-bf7a-66c6b7d6bf12 tempest-ServerMetadataNegativeTestJSON-312686674 tempest-ServerMetadataNegativeTestJSON-312686674-project-member] [instance: f13dcee8-331c-4107-b044-0709b15c9ed0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1006.581231] nova-conductor[52019]: Traceback (most recent call last): [ 1006.581231] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1006.581231] nova-conductor[52019]: return func(*args, **kwargs) [ 1006.581231] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1006.581231] nova-conductor[52019]: selections = self._select_destinations( [ 1006.581231] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1006.581231] nova-conductor[52019]: selections = self._schedule( [ 1006.581231] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1006.581231] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1006.581231] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1006.581231] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1006.581231] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 1006.581231] nova-conductor[52019]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.581231] nova-conductor[52019]: ERROR nova.conductor.manager [ 1006.591900] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1006.592194] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1006.592413] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1006.608468] nova-conductor[52020]: Traceback (most recent call last): [ 1006.608468] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1006.608468] nova-conductor[52020]: return func(*args, **kwargs) [ 1006.608468] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1006.608468] nova-conductor[52020]: selections = self._select_destinations( [ 1006.608468] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1006.608468] nova-conductor[52020]: selections = self._schedule( [ 1006.608468] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1006.608468] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1006.608468] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1006.608468] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1006.608468] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.608468] nova-conductor[52020]: ERROR nova.conductor.manager [ 1006.614828] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1006.615054] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1006.615161] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1006.659365] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] [instance: c3af4bbf-d1b6-4fe6-b6f3-077118ffffc6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1006.660098] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1006.660331] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1006.660827] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 
tempest-ServersTestManualDisk-761691693-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1006.664570] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1006.664570] nova-conductor[52019]: Traceback (most recent call last): [ 1006.664570] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1006.664570] nova-conductor[52019]: return func(*args, **kwargs) [ 1006.664570] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1006.664570] nova-conductor[52019]: selections = self._select_destinations( [ 1006.664570] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1006.664570] nova-conductor[52019]: selections = self._schedule( [ 1006.664570] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1006.664570] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1006.664570] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1006.664570] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1006.664570] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1006.664570] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1006.664570] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-572b1664-6695-4077-a4cc-bc3d221337f4 tempest-ServersTestManualDisk-761691693 tempest-ServersTestManualDisk-761691693-project-member] [instance: c3af4bbf-d1b6-4fe6-b6f3-077118ffffc6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
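Every NoValidHost traceback in this stretch of the log bottoms out in the same place: the scheduler's host-sufficiency check (nova/scheduler/manager.py, _ensure_sufficient_hosts, line 499 in these tracebacks). The snippet below is a minimal, hypothetical sketch of that kind of check, added only to make the failure mode concrete; the names and signatures are illustrative and are not the actual nova source.

# Minimal sketch (illustration only, not the actual nova code) of the kind of
# host-sufficiency check that produces the NoValidHost tracebacks above: the
# scheduler compares how many hosts survived filtering against how many the
# request needs and raises when the pool is too small.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""
    def __init__(self, reason):
        super().__init__("No valid host was found. " + reason)

def ensure_sufficient_hosts(hosts, required_count=1):
    # 'hosts' is whatever survived the scheduler filters; if fewer hosts
    # remain than the request needs, scheduling cannot proceed.
    if len(hosts) < required_count:
        raise NoValidHost(reason="There are not enough hosts available.")
    return hosts

# An empty candidate list reproduces the message seen in the log entries above.
try:
    ensure_sufficient_hosts([])
except NoValidHost as exc:
    print(exc)  # No valid host was found. There are not enough hosts available.
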
[ 1006.673975] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] [instance: 1088641d-dcb7-4fb3-9fce-6edf044eba09] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1006.673975] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1006.673975] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1006.673975] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1006.676494] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 1006.676494] nova-conductor[52020]: Traceback (most recent call last): [ 1006.676494] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1006.676494] nova-conductor[52020]: return func(*args, **kwargs) [ 1006.676494] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1006.676494] nova-conductor[52020]: selections = self._select_destinations( [ 1006.676494] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1006.676494] nova-conductor[52020]: selections = self._schedule( [ 1006.676494] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1006.676494] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1006.676494] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1006.676494] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1006.676494] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1006.676494] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1006.677517] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-71fba461-e8a8-46ab-99bf-6c6664f0a400 tempest-ServersNegativeTestMultiTenantJSON-1694403878 tempest-ServersNegativeTestMultiTenantJSON-1694403878-project-member] [instance: 1088641d-dcb7-4fb3-9fce-6edf044eba09] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1014.763747] nova-conductor[52019]: Traceback (most recent call last): [ 1014.763747] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1014.763747] nova-conductor[52019]: return func(*args, **kwargs) [ 1014.763747] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1014.763747] nova-conductor[52019]: selections = self._select_destinations( [ 1014.763747] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1014.763747] nova-conductor[52019]: selections = self._schedule( [ 1014.763747] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1014.763747] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1014.763747] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1014.763747] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1014.763747] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.763747] nova-conductor[52019]: ERROR nova.conductor.manager [ 1014.770048] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1014.770273] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1014.770444] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1014.806140] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] [instance: 23647652-d8e1-4f83-bb50-3d408d62741c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1014.806782] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1014.806983] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1014.807163] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] 
Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1014.809850] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1014.809850] nova-conductor[52019]: Traceback (most recent call last): [ 1014.809850] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1014.809850] nova-conductor[52019]: return func(*args, **kwargs) [ 1014.809850] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1014.809850] nova-conductor[52019]: selections = self._select_destinations( [ 1014.809850] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1014.809850] nova-conductor[52019]: selections = self._schedule( [ 1014.809850] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1014.809850] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1014.809850] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1014.809850] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1014.809850] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1014.809850] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1014.810398] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-bf2c9584-667e-4a30-b9b2-83f3ae89c328 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] [instance: 23647652-d8e1-4f83-bb50-3d408d62741c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1017.464339] nova-conductor[52020]: Traceback (most recent call last): [ 1017.464339] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1017.464339] nova-conductor[52020]: return func(*args, **kwargs) [ 1017.464339] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1017.464339] nova-conductor[52020]: selections = self._select_destinations( [ 1017.464339] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1017.464339] nova-conductor[52020]: selections = self._schedule( [ 1017.464339] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1017.464339] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1017.464339] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1017.464339] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1017.464339] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 1017.464339] nova-conductor[52020]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.464339] nova-conductor[52020]: ERROR nova.conductor.manager [ 1017.471377] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1017.471594] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1017.471762] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1017.511035] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] [instance: c334cdf9-22bd-402c-a474-dd33deaca038] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1017.511359] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1017.511596] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1017.511769] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] 
Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1017.514710] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1017.514710] nova-conductor[52020]: Traceback (most recent call last): [ 1017.514710] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1017.514710] nova-conductor[52020]: return func(*args, **kwargs) [ 1017.514710] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1017.514710] nova-conductor[52020]: selections = self._select_destinations( [ 1017.514710] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1017.514710] nova-conductor[52020]: selections = self._schedule( [ 1017.514710] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1017.514710] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1017.514710] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1017.514710] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1017.514710] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1017.514710] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1017.515241] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-8549088e-5c13-4b75-892f-77f05583768e tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] [instance: c334cdf9-22bd-402c-a474-dd33deaca038] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1020.053308] nova-conductor[52019]: Traceback (most recent call last): [ 1020.053308] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1020.053308] nova-conductor[52019]: return func(*args, **kwargs) [ 1020.053308] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1020.053308] nova-conductor[52019]: selections = self._select_destinations( [ 1020.053308] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1020.053308] nova-conductor[52019]: selections = self._schedule( [ 1020.053308] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1020.053308] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1020.053308] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1020.053308] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1020.053308] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 1020.053308] nova-conductor[52019]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.053308] nova-conductor[52019]: ERROR nova.conductor.manager [ 1020.059955] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1020.060247] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1020.060422] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1020.160879] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] [instance: 53cdf163-fffd-4b5c-883c-1e38139a08f0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1020.161757] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1020.162035] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1020.162309] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] 
Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1020.165114] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1020.165114] nova-conductor[52019]: Traceback (most recent call last): [ 1020.165114] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1020.165114] nova-conductor[52019]: return func(*args, **kwargs) [ 1020.165114] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1020.165114] nova-conductor[52019]: selections = self._select_destinations( [ 1020.165114] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1020.165114] nova-conductor[52019]: selections = self._schedule( [ 1020.165114] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1020.165114] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1020.165114] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1020.165114] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1020.165114] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1020.165114] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1020.165639] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-48cf75e3-f7b2-4076-9367-2162178c24f3 tempest-AttachVolumeNegativeTest-2016871791 tempest-AttachVolumeNegativeTest-2016871791-project-member] [instance: 53cdf163-fffd-4b5c-883c-1e38139a08f0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1028.132920] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1028.134026] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1028.134026] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2545ca35-7a3f-47ed-b0de-e1bb26967379.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 2545ca35-7a3f-47ed-b0de-e1bb26967379. [ 1028.134026] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2545ca35-7a3f-47ed-b0de-e1bb26967379. [ 1028.152567] nova-conductor[52019]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1028.273024] nova-conductor[52019]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1028.276383] nova-conductor[52019]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.288500] nova-conductor[52019]: DEBUG nova.db.main.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Created instance_extra for 2545ca35-7a3f-47ed-b0de-e1bb26967379 {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1048.921991] nova-conductor[52019]: DEBUG nova.db.main.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Created instance_extra for 06d5ac6a-7734-46e3-80c5-d960821b7552 {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1078.570738] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 71554abb-780c-4681-909f-8ff93712c82e was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1078.571204] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1078.571412] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 71554abb-780c-4681-909f-8ff93712c82e.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 71554abb-780c-4681-909f-8ff93712c82e. [ 1078.571626] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 71554abb-780c-4681-909f-8ff93712c82e. 
[ 1078.590601] nova-conductor[52019]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1078.608463] nova-conductor[52019]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1078.612062] nova-conductor[52019]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1098.105750] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1098.107232] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1098.107489] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4. [ 1098.107705] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4. [ 1098.130058] nova-conductor[52019]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] deallocate_for_instance() {{(pid=52019) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1098.147968] nova-conductor[52019]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance cache missing network info. {{(pid=52019) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1098.151220] nova-conductor[52019]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Updating instance_info_cache with network_info: [] {{(pid=52019) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1100.017560] nova-conductor[52020]: Traceback (most recent call last): [ 1100.017560] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1100.017560] nova-conductor[52020]: return func(*args, **kwargs) [ 1100.017560] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1100.017560] nova-conductor[52020]: selections = self._select_destinations( [ 1100.017560] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1100.017560] nova-conductor[52020]: selections = self._schedule( [ 1100.017560] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1100.017560] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1100.017560] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1100.017560] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1100.017560] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager result = self.transport._send( [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager raise result [ 1100.017560] nova-conductor[52020]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager selections = self._schedule( [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.017560] nova-conductor[52020]: ERROR nova.conductor.manager [ 1100.024307] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.024527] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.024700] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.061096] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] [instance: c66cf6fc-7b50-4e55-b4f1-da50cf1241e3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1100.061776] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.061984] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.062173] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.064803] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1100.064803] nova-conductor[52020]: Traceback (most recent call last): [ 1100.064803] nova-conductor[52020]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1100.064803] nova-conductor[52020]: return func(*args, **kwargs) [ 1100.064803] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1100.064803] nova-conductor[52020]: selections = self._select_destinations( [ 1100.064803] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1100.064803] nova-conductor[52020]: selections = self._schedule( [ 1100.064803] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1100.064803] nova-conductor[52020]: self._ensure_sufficient_hosts( [ 1100.064803] nova-conductor[52020]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1100.064803] nova-conductor[52020]: raise exception.NoValidHost(reason=reason) [ 1100.064803] nova-conductor[52020]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1100.064803] nova-conductor[52020]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1100.065318] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-0e57ed56-4035-430a-b349-3cb81c717232 tempest-ServersAdmin275Test-1353520415 tempest-ServersAdmin275Test-1353520415-project-member] [instance: c66cf6fc-7b50-4e55-b4f1-da50cf1241e3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1104.678270] nova-conductor[52019]: Traceback (most recent call last): [ 1104.678270] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1104.678270] nova-conductor[52019]: return func(*args, **kwargs) [ 1104.678270] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1104.678270] nova-conductor[52019]: selections = self._select_destinations( [ 1104.678270] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1104.678270] nova-conductor[52019]: selections = self._schedule( [ 1104.678270] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1104.678270] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1104.678270] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1104.678270] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1104.678270] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 1104.678270] nova-conductor[52019]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.678270] nova-conductor[52019]: ERROR nova.conductor.manager [ 1104.685627] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1104.686324] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1104.687472] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1104.727274] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] [instance: bdf617dc-33e3-462d-a1e0-3b9767204eb6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1104.728096] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1104.728321] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1104.728489] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" 
by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1104.731699] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1104.731699] nova-conductor[52019]: Traceback (most recent call last): [ 1104.731699] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1104.731699] nova-conductor[52019]: return func(*args, **kwargs) [ 1104.731699] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1104.731699] nova-conductor[52019]: selections = self._select_destinations( [ 1104.731699] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1104.731699] nova-conductor[52019]: selections = self._schedule( [ 1104.731699] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1104.731699] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1104.731699] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1104.731699] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1104.731699] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1104.731699] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1104.732279] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-987f4cac-ecf1-4ff4-b2cc-f3e893124f23 tempest-ServerRescueTestJSON-1134944411 tempest-ServerRescueTestJSON-1134944411-project-member] [instance: bdf617dc-33e3-462d-a1e0-3b9767204eb6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1127.998067] nova-conductor[52019]: ERROR nova.scheduler.utils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1127.998700] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Rescheduling: True {{(pid=52019) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1127.998926] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116. 
[ 1127.999151] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116. [ 1128.408919] nova-conductor[52020]: DEBUG nova.db.main.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Created instance_extra for 238825ed-3715-444c-be7c-f42f3884df7c {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1146.759778] nova-conductor[52019]: Traceback (most recent call last): [ 1146.759778] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1146.759778] nova-conductor[52019]: return func(*args, **kwargs) [ 1146.759778] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1146.759778] nova-conductor[52019]: selections = self._select_destinations( [ 1146.759778] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1146.759778] nova-conductor[52019]: selections = self._schedule( [ 1146.759778] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1146.759778] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1146.759778] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1146.759778] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1146.759778] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager result = self.transport._send( [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager raise result [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager selections = self._schedule( [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.759778] nova-conductor[52019]: ERROR nova.conductor.manager [ 1146.765871] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.766098] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.766270] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.811279] nova-conductor[52019]: DEBUG nova.conductor.manager [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 29f6b4fa-7df1-49ba-94ad-ee3dc3a9126d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52019) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1146.812081] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.812314] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.812484] nova-conductor[52019]: DEBUG oslo_concurrency.lockutils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 
tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52019) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.815376] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1146.815376] nova-conductor[52019]: Traceback (most recent call last): [ 1146.815376] nova-conductor[52019]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1146.815376] nova-conductor[52019]: return func(*args, **kwargs) [ 1146.815376] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1146.815376] nova-conductor[52019]: selections = self._select_destinations( [ 1146.815376] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1146.815376] nova-conductor[52019]: selections = self._schedule( [ 1146.815376] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1146.815376] nova-conductor[52019]: self._ensure_sufficient_hosts( [ 1146.815376] nova-conductor[52019]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1146.815376] nova-conductor[52019]: raise exception.NoValidHost(reason=reason) [ 1146.815376] nova-conductor[52019]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1146.815376] nova-conductor[52019]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1146.815956] nova-conductor[52019]: WARNING nova.scheduler.utils [None req-21416bcf-5866-4fb3-8727-740404c0ce19 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 29f6b4fa-7df1-49ba-94ad-ee3dc3a9126d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1147.942687] nova-conductor[52019]: DEBUG nova.db.main.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Created instance_extra for 2e622c9d-369c-4c36-a477-3237bea4cf7c {{(pid=52019) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1149.354810] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52020) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 1149.365676] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1149.365893] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1149.366077] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1149.388849] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1149.389083] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1149.389266] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1149.389609] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1149.389796] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock 
"11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1149.389950] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1149.398217] nova-conductor[52020]: DEBUG nova.quota [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Getting quotas for project d4ef55cbd57248dbb887968a4efde03b. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 1149.400541] nova-conductor[52020]: DEBUG nova.quota [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Getting quotas for user d469ed909114450e86057d08dd15d305 and project d4ef55cbd57248dbb887968a4efde03b. Resources: {'instances', 'ram', 'cores'} {{(pid=52020) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 1149.406167] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52020) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 1149.406637] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1149.406836] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1149.407008] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1149.411662] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 
[ 1149.411662] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='a816e082-61f0-4ffa-a214-1bf6bd197f53',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52020) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 1149.412292] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1149.412497] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1149.412653] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1149.425541] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1149.425735] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1149.425901] nova-conductor[52020]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52020) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
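The block_device_mapping logged above is the instance's root disk: image-backed (source_type='image', image_id a816e082-61f0-4ffa-a214-1bf6bd197f53), materialized on hypervisor-local storage (destination_type='local'), boot index 0, and deleted with the instance. Assuming the usual one-to-one mapping to the API's block_device_mapping_v2 fields, the client-side equivalent would look roughly like:

# Illustrative only; the values are copied from the log entry, the dict is not part of it.
root_disk_bdm = {
    "boot_index": 0,
    "uuid": "a816e082-61f0-4ffa-a214-1bf6bd197f53",  # the image_id above
    "source_type": "image",
    "destination_type": "local",
    "device_type": "disk",
    "delete_on_termination": True,
}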
[ 1195.891761] nova-conductor[52020]: DEBUG nova.db.main.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Created instance_extra for 9745bc90-6927-46a9-af48-df69046dc2a2 {{(pid=52020) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}}
[ 1243.069693] nova-conductor[52020]: ERROR nova.scheduler.utils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 3c337acb-ce39-44b5-a898-b40d7f4d5234 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 1243.070292] nova-conductor[52020]: DEBUG nova.conductor.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Rescheduling: True {{(pid=52020) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
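The ERROR above carries the root cause for this request: the VMDK copy task started by copy_virtual_disk failed in vCenter with VimFaultException ('A specified parameter was not correct: fileType', fault InvalidArgument) while the sparse image was being cached, and _build_and_run_instance wrapped it in a RescheduledException so the conductor could retry the build elsewhere ('Rescheduling: True'). When digging through a log in this format, a small helper can isolate the entries for the failing request; a sketch, assuming only the '[ <seconds>] service[pid]: ...' record prefix visible above:

import re

# Each record begins with a "[ <seconds>.<micros>]" timestamp; split on that,
# then keep the records mentioning a given request ID
# (e.g. req-2002e003-76ff-402d-b795-06d85bfc3029).
ENTRY_START = re.compile(r"\[ *\d+\.\d+\] ")

def entries_for_request(log_text, request_id):
    starts = [m.start() for m in ENTRY_START.finditer(log_text)]
    bounds = zip(starts, starts[1:] + [len(log_text)])
    return [log_text[s:e].strip() for s, e in bounds if request_id in log_text[s:e]]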
[ 1243.070564] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3c337acb-ce39-44b5-a898-b40d7f4d5234.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3c337acb-ce39-44b5-a898-b40d7f4d5234.
[ 1243.070879] nova-conductor[52020]: WARNING nova.scheduler.utils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3c337acb-ce39-44b5-a898-b40d7f4d5234.
[ 1243.093345] nova-conductor[52020]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] deallocate_for_instance() {{(pid=52020) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1243.137289] nova-conductor[52020]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Instance cache missing network info. {{(pid=52020) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 1243.140465] nova-conductor[52020]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Updating instance_info_cache with network_info: [] {{(pid=52020) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
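The MaxRetriesExceeded warnings are consistent with the earlier scheduling decision: 'Alternates: []' means the scheduler supplied no fallback hosts, so once cpu-1 failed there was nothing left to reschedule onto; the instance is set to ERROR and its (empty) network info cache is cleaned up via deallocate_for_instance(). An illustrative reduction of that bookkeeping, not Nova's actual code:

class MaxRetriesExceeded(Exception):
    """Stand-in for nova.exception.MaxRetriesExceeded (illustrative)."""

def next_build_host(alternates, already_tried):
    # With no alternates recorded at scheduling time, the first reschedule
    # already exhausts the candidate list, which produces the "Exhausted all
    # hosts available for retrying build failures" warning above.
    remaining = [host for host in alternates if host not in already_tried]
    if not remaining:
        raise MaxRetriesExceeded(
            "Exceeded maximum number of retries. Exhausted all hosts available "
            "for retrying build failures")
    return remaining[0]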