[ 495.976094] nova-conductor[52620]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 497.180769] nova-conductor[52620]: DEBUG oslo_db.sqlalchemy.engines [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52620) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 497.206699] nova-conductor[52620]: DEBUG nova.context [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),d0ae54bd-e0f8-43cb-838f-391ff3e34a72(cell1) {{(pid=52620) load_cells /opt/stack/nova/nova/context.py:464}} [ 497.208607] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 497.208819] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 497.209294] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 497.209660] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 497.209841] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 497.210772] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 497.216055] nova-conductor[52620]: DEBUG oslo_db.sqlalchemy.engines [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52620) _check_effective_sql_mode 
/usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 497.216545] nova-conductor[52620]: DEBUG oslo_db.sqlalchemy.engines [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52620) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 497.276986] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Acquiring lock "singleton_lock" {{(pid=52620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 497.277211] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Acquired lock "singleton_lock" {{(pid=52620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 497.277400] nova-conductor[52620]: DEBUG oslo_concurrency.lockutils [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Releasing lock "singleton_lock" {{(pid=52620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 497.277818] nova-conductor[52620]: INFO oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Starting 2 workers [ 497.282526] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Started child 53039 {{(pid=52620) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 497.288020] nova-conductor[53039]: INFO nova.service [-] Starting conductor node (version 0.1.0) [ 497.288313] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Started child 53040 {{(pid=52620) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 497.288313] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Full set of CONF: {{(pid=52620) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}} [ 497.288313] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ******************************************************************************** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 497.288313] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] Configuration options gathered from: {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 497.288313] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 497.288313] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] config files: ['/etc/nova/nova.conf'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 497.288470] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ================================================================================ {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 497.288602] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] allow_resize_to_same_host = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.288807] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] arq_binding_timeout = 300 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.289016] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] block_device_allocate_retries = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.289211] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] block_device_allocate_retries_interval = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.289431] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cert = self.pem {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292127] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute_driver = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292127] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute_monitors = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292127] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] config_dir = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292228] nova-conductor[53040]: INFO nova.service [-] Starting conductor node (version 0.1.0) [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] config_drive_format = iso9660 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] config_file = ['/etc/nova/nova.conf'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] config_source = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] console_host = devstack {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] control_exchange = nova {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cpu_allocation_ratio = None {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292411] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] daemon = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292583] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] debug = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292583] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] default_access_ip_network_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292583] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] default_availability_zone = nova {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292583] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] default_ephemeral_format = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292583] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292749] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] default_schedule_zone = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292992] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] disk_allocation_ratio = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.292992] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] enable_new_services = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.293209] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] enabled_apis = ['osapi_compute'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.293409] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] enabled_ssl_apis = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.293591] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] flat_injected = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.293746] 
nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] force_config_drive = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.293921] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] force_raw_images = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.294113] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] graceful_shutdown_timeout = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.294285] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] heal_instance_info_cache_interval = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.294798] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] host = devstack {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.295013] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.295193] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] initial_disk_allocation_ratio = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.295659] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] initial_ram_allocation_ratio = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.295659] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.295781] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_build_timeout = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.295951] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_delete_interval = 300 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.296130] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_format = [instance: %(uuid)s] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.296309] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_name_template = instance-%08x {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.296479] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_usage_audit = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.296666] nova-conductor[52620]: DEBUG 
oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_usage_audit_period = month {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.296843] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.297044] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] instances_path = /opt/stack/data/nova/instances {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.297205] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] internal_service_availability_zone = internal {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.297353] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] key = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.297489] nova-conductor[53039]: DEBUG oslo_db.sqlalchemy.engines [None req-738dbb38-c264-46ca-a946-d3df4292901a None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=53039) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 497.297550] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] live_migration_retry_count = 30 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.297733] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_config_append = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298227] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298227] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_dir = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298338] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298401] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_options = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298574] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_rotate_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298761] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_rotate_interval_type = days {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.298937] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] log_rotation_type = none {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.299100] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.299236] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.299413] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.299605] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.299731] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.299921] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] long_rpc_timeout = 1800 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.300088] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] max_concurrent_builds = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.300264] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] max_concurrent_live_migrations = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.300464] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] max_concurrent_snapshots = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.300629] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] max_local_block_devices = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.300801] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] max_logfile_count = 30 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.300953] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] max_logfile_size_mb = 200 
{{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.301116] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] maximum_instance_delete_attempts = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.301304] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metadata_listen = 0.0.0.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.301509] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metadata_listen_port = 8775 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.301676] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metadata_workers = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.301828] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] migrate_max_retries = -1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.302011] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] mkisofs_cmd = genisoimage {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.302213] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] my_block_storage_ip = 10.180.1.21 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.302341] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] my_ip = 10.180.1.21 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.302494] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] network_allocate_retries = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.302671] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.302847] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] osapi_compute_listen = 0.0.0.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.303101] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] osapi_compute_listen_port = 8774 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.303197] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] osapi_compute_unique_server_name_scope = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.303361] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] osapi_compute_workers = 2 {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.303534] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] password_length = 12 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.303709] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] periodic_enable = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.303878] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] periodic_fuzzy_delay = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.304374] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] pointer_model = usbtablet {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.304374] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] preallocate_images = none {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.304455] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] publish_errors = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.304529] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] pybasedir = /opt/stack/nova {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.304680] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ram_allocation_ratio = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.304837] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rate_limit_burst = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.305023] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rate_limit_except_level = CRITICAL {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.305208] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rate_limit_interval = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.305379] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reboot_timeout = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.305549] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reclaim_instance_interval = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.305697] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] record = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.305848] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reimage_timeout_per_gb = 20 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306000] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] report_interval = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306164] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rescue_timeout = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306317] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reserved_host_cpus = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306477] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reserved_host_disk_mb = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306625] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reserved_host_memory_mb = 512 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306781] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] reserved_huge_pages = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.306933] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] resize_confirm_window = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.307097] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] resize_fs_using_block_device = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.307311] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] resume_guests_state_on_host_boot = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.307437] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.307644] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rpc_response_timeout = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309020] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] run_external_periodic_tasks = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309020] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] running_deleted_instance_action = reap {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309020] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] running_deleted_instance_poll_interval = 
1800 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309020] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] running_deleted_instance_timeout = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309164] nova-conductor[53040]: DEBUG oslo_db.sqlalchemy.engines [None req-a0342772-c4e9-4348-ba43-9c68fdde1bf4 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=53040) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 497.309195] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler_instance_sync_interval = 120 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309195] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_down_time = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309195] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] servicegroup_driver = db {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309195] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] shelved_offload_time = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309195] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] shelved_poll_interval = 3600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309315] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] shutdown_timeout = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309444] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] source_is_ipv6 = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309581] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ssl_only = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309764] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] state_path = /opt/stack/data/nova {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.309916] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] sync_power_state_interval = 600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.310078] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] sync_power_state_pool_size = 1000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.310254] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] syslog_log_facility = LOG_USER {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.310428] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] tempdir = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.310583] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] timeout_nbd = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.310766] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] transport_url = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311633] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] update_resources_interval = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311633] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_cow_images = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311633] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_eventlog = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311633] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_journal = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311633] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_json = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311885] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_rootwrap_daemon = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.311885] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_stderr = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.312008] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] use_syslog = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.312158] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vcpu_pin_set = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.312317] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vif_plugging_is_fatal = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.312495] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vif_plugging_timeout = 300 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.312695] 
nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] virt_mkfs = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.312850] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] volume_usage_poll_interval = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.313016] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] watch_log_file = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.313249] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] web = /usr/share/spice-html5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 497.313508] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_concurrency.disable_process_locking = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.313704] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.313906] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.314082] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.314250] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.314433] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.314840] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.314840] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.auth_strategy = keystone {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.314991] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.compute_link_prefix = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.315208] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 
2008-02-01 2008-09-01 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.315379] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.dhcp_domain = novalocal {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.315570] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.enable_instance_password = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.315722] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.glance_link_prefix = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.315880] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.316072] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.316249] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.instance_list_per_project_cells = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.316407] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.list_records_by_skipping_down_cells = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317062] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.local_metadata_per_cell = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317062] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.max_limit = 1000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317062] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.metadata_cache_expiration = 15 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317062] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.neutron_default_tenant_id = default {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317296] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.use_forwarded_for = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317369] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.use_neutron_default_nets = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317567] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
api.vendordata_dynamic_connect_timeout = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317729] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.317909] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.318096] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.318260] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.vendordata_dynamic_targets = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.318418] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.vendordata_jsonfile_path = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.318613] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.318863] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.backend = dogpile.cache.memcached {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.319039] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.backend_argument = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.319229] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.config_prefix = cache.oslo {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.319423] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.dead_timeout = 60.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.319598] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.debug_cache_backend = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.319773] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.enable_retry_client = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.319927] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.enable_socket_keepalive = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.320106] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
cache.enabled = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.320283] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.expiration_time = 600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.320477] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.hashclient_retry_attempts = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.320645] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.hashclient_retry_delay = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.320802] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_dead_retry = 300 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.320961] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_password = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.321131] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.321200] nova-conductor[53039]: DEBUG nova.service [None req-738dbb38-c264-46ca-a946-d3df4292901a None None] Creating RPC server for service conductor {{(pid=53039) start /opt/stack/nova/nova/service.py:182}} [ 497.321284] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.321437] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_pool_maxsize = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.321612] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.321769] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_sasl_enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.321939] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.322108] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_socket_timeout = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.322269] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.memcache_username = {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323046] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.proxies = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323046] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.retry_attempts = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323046] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.retry_delay = 0.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323046] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.socket_keepalive_count = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323046] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.socket_keepalive_idle = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323261] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.socket_keepalive_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323347] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.tls_allowed_ciphers = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323533] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.tls_cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323662] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.tls_certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323820] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.tls_enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.323996] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cache.tls_keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.324219] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.324406] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.auth_type = password {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.324571] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.324759] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.catalog_info = volumev3::publicURL {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.324946] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.325096] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.325278] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.cross_az_attach = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.325437] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.debug = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.325598] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.endpoint_template = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.325778] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.http_retries = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.325944] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.326112] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.326298] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.os_region_name = RegionOne {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.326529] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.326628] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cinder.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.326800] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.326957] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.cpu_dedicated_set = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.327122] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.cpu_shared_set = None 
{{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.327289] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.image_type_exclude_list = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.327540] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.327737] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.max_concurrent_disk_ops = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.327915] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.max_disk_devices_to_attach = -1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.328086] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.328278] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.328445] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.resource_provider_association_refresh = 300 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.328606] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.shutdown_retry_interval = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.328784] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.328823] nova-conductor[53040]: DEBUG nova.service [None req-a0342772-c4e9-4348-ba43-9c68fdde1bf4 None None] Creating RPC server for service conductor {{(pid=53040) start /opt/stack/nova/nova/service.py:182}} [ 497.328961] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] conductor.workers = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.329159] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] console.allowed_origins = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.329319] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] console.ssl_ciphers = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.329519] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
console.ssl_minimum_version = default {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.329700] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] consoleauth.token_ttl = 600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.329892] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.330063] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.330227] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.330382] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.330605] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.330792] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.330956] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.331127] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.331287] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.331442] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.331605] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.region_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.331761] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.331923] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.service_type = accelerator {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 497.332116] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.332278] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.332436] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.332596] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.332775] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.332933] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] cyborg.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.333865] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.backend = sqlalchemy {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.333865] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.connection = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.333865] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.connection_debug = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.333865] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.connection_parameters = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.333865] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.connection_recycle_time = 3600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.334082] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.connection_trace = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.334208] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.db_inc_retry_interval = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.334368] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.db_max_retries = 20 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.334535] nova-conductor[52620]: DEBUG 
oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.db_max_retry_interval = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.334696] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.db_retry_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335120] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.max_overflow = 50 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335120] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.max_pool_size = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335217] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.max_retries = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335369] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.mysql_enable_ndb = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335548] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335681] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.mysql_wsrep_sync_wait = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.335834] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.pool_timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.336199] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.retry_interval = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.336199] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.slave_connection = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.336330] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.sqlite_synchronous = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.336492] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] database.use_db_reconnect = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.336672] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.backend = sqlalchemy {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.336847] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.connection = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.337025] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.connection_debug = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.337191] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.connection_parameters = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.337461] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.connection_recycle_time = 3600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.337550] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.connection_trace = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.337684] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.db_inc_retry_interval = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338057] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.db_max_retries = 20 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338057] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.db_max_retry_interval = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338169] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.db_retry_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338331] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.max_overflow = 50 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338488] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.max_pool_size = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338650] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.max_retries = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338812] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.mysql_enable_ndb = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.338976] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.339142] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.339299] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.pool_timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.339493] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.retry_interval = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.339650] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.slave_connection = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.339810] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] api_database.sqlite_synchronous = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.340039] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] devices.enabled_mdev_types = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.340191] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.340349] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ephemeral_storage_encryption.enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.340511] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.340712] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.api_servers = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.340874] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.341062] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.341228] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.341383] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.341537] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.341722] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.debug = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.341930] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.default_trusted_certificate_ids = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.342106] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.enable_certificate_validation = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.342272] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.enable_rbd_download = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.342409] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.342574] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343238] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343238] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343238] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343238] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.num_retries = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343380] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.rbd_ceph_conf = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343491] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.rbd_connect_timeout = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.343655] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.rbd_pool = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344343] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.rbd_user = {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344343] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.region_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344343] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344343] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.service_type = image {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344511] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344601] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344751] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.344900] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.345083] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.345239] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.verify_glance_signatures = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.345429] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] glance.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.345650] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] guestfs.debug = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.345791] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.config_drive_cdrom = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.345951] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.config_drive_inject_password = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.346123] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
497.346384] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.enable_instance_metrics_collection = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.346434] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.enable_remotefx = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.346603] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.instances_path_share = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.346812] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.iscsi_initiator_list = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.346905] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.limit_cpu_features = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.347531] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.347531] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.347531] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.power_state_check_timeframe = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.347531] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.347721] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.347876] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.use_multipath_io = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.348044] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.volume_attach_retry_count = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.348197] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.348352] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.vswitch_name = None {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.348520] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.348787] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] mks.enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.349280] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.349552] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] image_cache.manager_interval = 2400 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.349757] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] image_cache.precache_concurrency = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.349915] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] image_cache.remove_unused_base_images = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.350090] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.350261] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.350620] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] image_cache.subdirectory_name = _base {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.350620] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.api_max_retries = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351308] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.api_retry_interval = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351308] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351308] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.auth_type = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351308] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.cafile = None {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351468] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351546] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351717] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.351870] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352028] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352184] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352335] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352485] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352636] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352854] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.partition_key = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.352940] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.peer_list = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.353179] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.region_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.353281] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.serial_console_state_timeout = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.353539] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.353599] nova-conductor[52620]: DEBUG 
oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.service_type = baremetal {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.353749] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.353931] nova-conductor[53039]: DEBUG nova.service [None req-738dbb38-c264-46ca-a946-d3df4292901a None None] Join ServiceGroup membership for this service conductor {{(pid=53039) start /opt/stack/nova/nova/service.py:199}} [ 497.353984] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.354034] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.354091] nova-conductor[53039]: DEBUG nova.servicegroup.drivers.db [None req-738dbb38-c264-46ca-a946-d3df4292901a None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=53039) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 497.354328] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.354391] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.354673] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ironic.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.354746] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.354936] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] key_manager.fixed_key = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.355177] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.355352] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.barbican_api_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.355509] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.barbican_endpoint = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.355672] nova-conductor[53040]: 
DEBUG nova.service [None req-a0342772-c4e9-4348-ba43-9c68fdde1bf4 None None] Join ServiceGroup membership for this service conductor {{(pid=53040) start /opt/stack/nova/nova/service.py:199}} [ 497.355705] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.barbican_endpoint_type = public {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.355855] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.barbican_region_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.355910] nova-conductor[53040]: DEBUG nova.servicegroup.drivers.db [None req-a0342772-c4e9-4348-ba43-9c68fdde1bf4 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=53040) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 497.356020] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.356179] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.356366] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.356487] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.356636] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.356818] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.number_of_retries = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.356993] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.retry_delay = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.357195] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.send_service_user_token = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.357359] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.357544] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.357713] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 
None None] barbican.verify_ssl = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.357869] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican.verify_ssl_path = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.358069] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.358249] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.auth_type = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.358406] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.358561] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.358718] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.358874] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.359037] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.359201] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.359357] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] barbican_service_user.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.359565] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.approle_role_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.359727] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.approle_secret_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.359886] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.360052] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
vault.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.360213] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.360407] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.360570] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.360711] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.kv_mountpoint = secret {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.360881] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.kv_version = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.361562] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.namespace = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.361727] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.root_token_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.361887] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.362052] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.ssl_ca_crt_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.362215] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.362373] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.use_ssl = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.362556] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.362752] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.362934] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.363124] nova-conductor[52620]: 
DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.363284] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.363440] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.363594] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.363758] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.363909] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.364072] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.364224] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.364376] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.region_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.364526] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.364696] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.service_type = identity {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.364853] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.365013] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.365172] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.365323] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None 
None] keystone.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.365524] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.365684] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] keystone.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.365918] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.connection_uri = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.366110] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_mode = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.366278] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_model_extra_flags = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.366445] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_models = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.366615] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_power_governor_high = performance {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.366779] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_power_governor_low = powersave {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.366937] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_power_management = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.367134] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.367314] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.device_detach_attempts = 8 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.367494] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.device_detach_timeout = 20 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.367664] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.disk_cachemodes = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.367823] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.disk_prefix = None 
{{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368011] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.enabled_perf_events = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368176] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.file_backed_memory = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368338] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.gid_maps = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368511] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.hw_disk_discard = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368668] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.hw_machine_type = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368837] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_rbd_ceph_conf = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.368991] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.369163] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.369323] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_rbd_glance_store_name = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.369512] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_rbd_pool = rbd {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.369684] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_type = default {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.369839] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.images_volume_group = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.369999] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.inject_key = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.370171] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.inject_partition = -2 {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.370330] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.inject_password = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.370519] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.iscsi_iface = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.370694] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.iser_use_multipath = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.370839] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_bandwidth = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371011] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371176] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_downtime = 500 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371333] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371488] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371644] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_inbound_addr = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371804] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.371960] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_permit_post_copy = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.372126] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_scheme = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.372299] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_timeout_action = abort {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.372468] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
libvirt.live_migration_tunnelled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.372616] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_uri = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.372777] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.live_migration_with_native_tls = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.372951] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.max_queues = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.373121] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.373297] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.nfs_mount_options = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.373636] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.373812] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.373978] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.num_iser_scan_tries = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.374146] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.num_memory_encrypted_guests = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.374305] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.374462] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.num_pcie_ports = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.374624] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.num_volume_scan_tries = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.374837] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.pmem_namespaces = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.374997] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
libvirt.quobyte_client_cfg = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.375248] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.375416] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rbd_connect_timeout = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.375578] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.375741] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.375897] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rbd_secret_uuid = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.376061] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rbd_user = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.376226] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.376412] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.remote_filesystem_transport = ssh {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.376572] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rescue_image_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.376730] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rescue_kernel_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.376884] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rescue_ramdisk_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.377056] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.377213] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.rx_queue_size = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.377423] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
libvirt.smbfs_mount_options = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.377705] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.377883] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.snapshot_compression = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.378060] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.snapshot_image_format = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.378283] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.378451] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.sparse_logical_volumes = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.378613] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.swtpm_enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.378781] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.swtpm_group = tss {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.378949] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.swtpm_user = tss {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.379129] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.sysinfo_serial = unique {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.379287] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.tx_queue_size = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.379474] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.uid_maps = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.379641] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.use_virtio_for_bridges = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.379809] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.virt_type = kvm {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.379974] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
libvirt.volume_clear = zero {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.380144] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.volume_clear_size = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.380309] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.volume_use_multipath = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.380471] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_cache_path = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.380630] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.380813] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_mount_group = qemu {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.380947] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_mount_opts = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.381125] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.381349] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.381520] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.vzstorage_mount_user = stack {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.381685] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.381850] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.382030] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.auth_type = password {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.382193] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.382350] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.382508] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.382664] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.382817] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383021] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.default_floating_pool = public {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383182] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383342] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.extension_sync_interval = 600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383499] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.http_retries = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383657] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383812] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.383970] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.384147] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.384303] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.384464] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.ovs_bridge = br-int {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.384625] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.physnets = [] 
{{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.384789] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.region_name = RegionOne {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.384953] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.service_metadata_proxy = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.385121] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.385291] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.service_type = network {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.385450] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.385606] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.385764] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.385922] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.386109] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.386270] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] neutron.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.386443] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] notifications.bdms_in_notifications = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.386619] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] notifications.default_level = INFO {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.386794] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] notifications.notification_format = unversioned {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.386953] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] notifications.notify_on_state_change = None {{(pid=52620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.387139] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.387342] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] pci.alias = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.387536] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] pci.device_spec = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.387706] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] pci.report_in_placement = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.387898] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.388086] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.auth_type = password {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.388283] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.388443] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.388600] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.388761] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.388915] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.389079] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.389236] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.default_domain_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.389393] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.default_domain_name = None {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.389569] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.domain_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.389729] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.domain_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.389883] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.390052] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.390209] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.390363] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.390528] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.390706] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.password = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.390868] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.project_domain_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.391043] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.project_domain_name = Default {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.391217] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.project_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.391389] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.project_name = service {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.391554] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.region_name = RegionOne {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.391726] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.391875] 
nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.service_type = placement {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392044] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392201] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392354] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392508] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.system_scope = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392662] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392813] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.trust_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.392969] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.user_domain_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.393151] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.user_domain_name = Default {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.393310] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.user_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.393483] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.username = placement {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.393702] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.393862] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] placement.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.394065] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.cores = 20 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.394231] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.count_usage_from_placement = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.394399] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.394586] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.injected_file_content_bytes = 10240 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.394751] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.injected_file_path_length = 255 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.394913] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.injected_files = 5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.395175] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.instances = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.395250] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.key_pairs = 100 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.395409] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.metadata_items = 128 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.395570] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.ram = 51200 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.395731] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.recheck_quota = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.395891] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.server_group_members = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.396066] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] quota.server_groups = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.396232] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rdp.enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.396539] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.396749] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
scheduler.discover_hosts_in_cells_interval = -1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.396950] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.397149] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.image_metadata_prefilter = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.397338] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.397549] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.max_attempts = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.397772] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.max_placement_results = 1000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.397962] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.398160] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.query_placement_for_availability_zone = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.398337] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.query_placement_for_image_type_support = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.398514] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.398708] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] scheduler.workers = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.398900] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.399103] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.399311] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.399511] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.399677] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.399868] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.400039] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.400248] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.400415] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.host_subset_size = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.400579] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.400739] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.400911] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.isolated_hosts = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.401105] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.isolated_images = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.401271] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.401430] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.401593] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.pci_in_placement = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.401751] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.401909] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.402077] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.402235] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.402398] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.402559] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.402738] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.track_instance_changes = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.402922] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.403111] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metrics.required = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.403275] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metrics.weight_multiplier = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.403437] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.403601] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] metrics.weight_setting = [] {{(pid=52620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.403902] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.404086] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] serial_console.enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.404279] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] serial_console.port_range = 10000:20000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.404447] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.404612] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.404794] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] serial_console.serialproxy_port = 6083 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.404956] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.405137] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.auth_type = password {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.405292] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.405446] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.405605] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.405759] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.405910] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.406092] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.send_service_user_token = True {{(pid=52620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.406264] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.406420] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] service_user.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.406584] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.agent_enabled = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.406766] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.407099] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.407329] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.407520] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.html5proxy_port = 6082 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.407691] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.image_compression = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.407865] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.jpeg_compression = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.408042] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.playback_compression = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.408218] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.server_listen = 127.0.0.1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.408381] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.408542] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.streaming_mode = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.408700] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] spice.zlib_compression = None {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.408884] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] upgrade_levels.baseapi = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.409053] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] upgrade_levels.cert = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.409254] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] upgrade_levels.compute = auto {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.409435] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] upgrade_levels.conductor = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.409605] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] upgrade_levels.scheduler = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.409771] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.409949] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.auth_type = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.410119] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.410277] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.410437] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.410595] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.410747] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.410903] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.411067] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vendordata_dynamic_auth.timeout = None {{(pid=52620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.411267] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.api_retry_count = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.411426] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.ca_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.411599] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.cache_prefix = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.411752] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.cluster_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.411910] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.connection_pool_size = 10 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.412072] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.console_delay_seconds = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.412227] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.datastore_regex = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.412385] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.host_ip = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.412544] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.host_password = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.412716] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.host_port = 443 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.412871] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.host_username = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.413037] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.413198] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.integration_bridge = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.413361] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.maximum_objects = 100 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.413520] nova-conductor[52620]: DEBUG 
oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.pbm_default_policy = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.413680] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.pbm_enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.413887] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.pbm_wsdl_location = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.414087] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.414249] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.serial_port_proxy_uri = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.414407] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.serial_port_service_uri = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.414572] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.task_poll_interval = 0.5 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.414734] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.use_linked_clone = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.414898] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.vnc_keymap = en-us {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.415073] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.vnc_port = 5900 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.415237] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vmware.vnc_port_total = 10000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.415458] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.auth_schemes = ['none'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.415645] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.enabled = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.415961] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.416158] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.416328] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.novncproxy_port = 6080 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.416506] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.server_listen = 127.0.0.1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.416677] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.416836] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.vencrypt_ca_certs = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.417008] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.vencrypt_client_cert = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.417164] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] vnc.vencrypt_client_key = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.417385] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.417570] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.417735] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.417892] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418061] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.disable_rootwrap = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418219] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.enable_numa_live_migration = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418374] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418529] 
nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418686] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418838] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.libvirt_disable_apic = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.418994] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.419171] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.419329] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.419511] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.419680] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.419840] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.419996] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.420167] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.420325] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.420491] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.420694] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.420862] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.client_socket_timeout = 900 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.421036] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.default_pool_size = 1000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.421202] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.keep_alive = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.421371] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.max_header_line = 16384 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.421531] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.secure_proxy_ssl_header = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.421694] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.ssl_ca_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.421851] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.ssl_cert_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.422014] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.ssl_key_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.422184] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.tcp_keepidle = 600 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.422358] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.422522] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] zvm.ca_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.422683] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] zvm.cloud_connector_url = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.422903] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.423095] nova-conductor[52620]: DEBUG oslo_service.service 
[None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] zvm.reachable_timeout = 300 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.423326] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.enforce_new_defaults = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.423517] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.enforce_scope = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.423723] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.policy_default_rule = default {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.423933] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.424136] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.policy_file = policy.yaml {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.424325] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.424502] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.424662] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.424836] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.424997] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.425198] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.425381] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.425595] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.connection_string = messaging:// {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.425767] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.enabled = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.425950] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.es_doc_type = notification {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.426140] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.es_scroll_size = 10000 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.426310] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.es_scroll_time = 2m {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.426474] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.filter_error_trace = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.426642] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.hmac_keys = SECRET_KEY {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.426805] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.sentinel_service_name = mymaster {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.426994] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.socket_timeout = 0.1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.427171] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] profiler.trace_sqlalchemy = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.427365] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] remote_debug.host = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.427567] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] remote_debug.port = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.427815] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.427988] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.428170] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=52620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.428330] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.428496] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.428660] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.428822] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.428984] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.429159] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.429318] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.429519] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.429696] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.429871] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.430048] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.430212] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.430389] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
497.430568] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.430741] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.430915] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.431089] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.431252] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.431423] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.431571] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.431796] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.431975] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.432168] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.ssl = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.432328] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.432495] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.432658] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.432822] nova-conductor[52620]: DEBUG oslo_service.service [None 
req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.432987] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_rabbit.ssl_version = {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.433210] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.433382] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_notifications.retry = -1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.433581] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.433841] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_messaging_notifications.transport_url = **** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.434135] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.auth_section = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.434317] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.auth_type = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.434477] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.cafile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.434635] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.certfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.434797] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.collect_timing = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.434956] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.connect_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.435123] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.connect_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.435303] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.endpoint_id = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.435461] nova-conductor[52620]: DEBUG oslo_service.service 
[None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.endpoint_override = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.435620] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.insecure = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.435776] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.keyfile = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.435930] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.max_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.436092] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.min_version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.436255] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.region_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.436427] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.service_name = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.436584] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.service_type = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.436739] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.split_loggers = False {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.436892] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.status_code_retries = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.437054] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.status_code_retry_delay = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.437207] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.timeout = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.437359] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.valid_interfaces = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.437536] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_limit.version = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.437745] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] 
oslo_reports.file_event_handler = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.437920] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.438086] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] oslo_reports.log_dir = None {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 497.438269] nova-conductor[52620]: DEBUG oslo_service.service [None req-1382e3e4-eaa0-41e2-8850-69fad6b74117 None None] ******************************************************************************** {{(pid=52620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 582.419227] nova-conductor[53040]: DEBUG oslo_db.sqlalchemy.engines [None req-12d99d6d-3f2d-4de0-89fa-a37a6ba45ef5 None None] Parent process 52620 forked (53040) with an open database connection, which is being discarded and recreated. {{(pid=53040) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 625.153109] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Took 0.87 seconds to select destinations for 1 instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 625.190550] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.190863] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.192648] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.201707] nova-conductor[53040]: DEBUG oslo_db.sqlalchemy.engines [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=53040) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 625.286933] 
nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.286933] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.004s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.286933] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.286933] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.287102] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.287102] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.295388] nova-conductor[53040]: DEBUG oslo_db.sqlalchemy.engines [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=53040) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 625.309538] nova-conductor[53040]: DEBUG nova.quota [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Getting quotas for project d7324cc23ca541659f0f82bd61038ec5. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 625.317329] nova-conductor[53040]: DEBUG nova.quota [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Getting quotas for user c7081d000b0e4ff8ab58084f5d0f9f41 and project d7324cc23ca541659f0f82bd61038ec5. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 625.322223] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 625.322799] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.323013] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.323182] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.334292] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 625.334292] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.334292] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.334654] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.365153] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.365153] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.365153] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.365153] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=53040) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.365318] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquired lock "compute-rpcapi-router" {{(pid=53040) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.366226] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.366226] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.366226] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.366663] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.366663] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.368561] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.375938] nova-conductor[53040]: INFO nova.compute.rpcapi [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 625.375938] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-354032de-04f4-45ba-89d7-6217293af7e8 None None] Releasing lock "compute-rpcapi-router" {{(pid=53040) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.133109] nova-conductor[53039]: DEBUG oslo_db.sqlalchemy.engines [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Parent process 52620 forked (53039) with an open database connection, which is being discarded and recreated. {{(pid=53039) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 627.380035] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Took 0.24 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 627.408887] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.409331] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.411073] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.417171] nova-conductor[53039]: DEBUG oslo_db.sqlalchemy.engines [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=53039) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 627.499517] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.499667] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.500140] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.500534] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.500750] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.500915] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.510661] nova-conductor[53039]: DEBUG oslo_db.sqlalchemy.engines [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=53039) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 627.527677] nova-conductor[53039]: DEBUG nova.quota [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Getting quotas for project a7835207110449299e6f867f379be296. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 627.531393] nova-conductor[53039]: DEBUG nova.quota [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Getting quotas for user d23e0a3670f64e449edc8f3bfdee61c7 and project a7835207110449299e6f867f379be296. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 627.539486] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 627.539881] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.540095] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.540256] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.544052] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.544747] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.544942] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.545118] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.569036] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.569036] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.569036] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.569284] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=53039) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.570134] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Acquired lock "compute-rpcapi-router" {{(pid=53039) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.570134] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.570134] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.570297] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.570642] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.570866] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.570968] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.578601] nova-conductor[53039]: INFO nova.compute.rpcapi [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 627.579071] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2500f665-7db3-4078-9784-c0dab2d36b82 None None] Releasing lock "compute-rpcapi-router" {{(pid=53039) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.293235] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Took 0.18 seconds to select destinations for 1 instance(s). 
{{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 629.314487] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.315044] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.315493] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.370355] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.370588] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.370759] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.371741] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.372519] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 
tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.373323] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.393169] nova-conductor[53040]: DEBUG nova.quota [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Getting quotas for project 367973317efd4063b56c3f337ad62856. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 629.394827] nova-conductor[53040]: DEBUG nova.quota [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Getting quotas for user 4d6907f55ce847c2908c56f082eb622a and project 367973317efd4063b56c3f337ad62856. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 629.402664] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 629.402664] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.402664] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.403325] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.407994] nova-conductor[53040]: DEBUG nova.conductor.manager [None 
req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 629.408636] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.408832] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.409227] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.437707] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.437707] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.437707] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 
0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.100678] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 631.118354] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.118799] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.119858] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.163377] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.163377] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.163377] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.163377] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.163705] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.163705] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.176511] nova-conductor[53039]: DEBUG nova.quota [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Getting quotas for project d62d59712e2f4a1db623289edd1f497a. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 631.178632] nova-conductor[53039]: DEBUG nova.quota [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Getting quotas for user 933d0db8f7d6467692158e28db97f69e and project d62d59712e2f4a1db623289edd1f497a. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 631.183959] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 631.184558] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.184759] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.184924] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.187739] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 631.188396] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.188594] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 
tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.188756] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.209609] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.210105] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.210299] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.744605] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 631.762366] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.762503] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.762734] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.799639] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.799944] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.800198] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.800611] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.800850] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.801061] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.811258] nova-conductor[53039]: DEBUG nova.quota [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Getting quotas for project f8c98a74b81e4f658a4a0ee9e97d8ab0. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 631.813324] nova-conductor[53039]: DEBUG nova.quota [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Getting quotas for user 0768a2b288954d3fa18c861d45c577d5 and project f8c98a74b81e4f658a4a0ee9e97d8ab0. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 631.818907] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 631.819486] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.819748] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.819973] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.823505] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 631.823888] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.824141] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.824346] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.837650] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.837972] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.838129] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.663664] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Took 
0.14 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 633.676323] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.676557] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.676737] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.709598] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.709954] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.710060] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.710422] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.710549] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.710737] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.719186] nova-conductor[53039]: DEBUG nova.quota [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Getting quotas for project 6e5c7d3736204e8eafce9963fa2a28eb. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 633.721440] nova-conductor[53039]: DEBUG nova.quota [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Getting quotas for user fde1ca8282c8470092a272f43a40b55d and project 6e5c7d3736204e8eafce9963fa2a28eb. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 633.728037] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 633.728347] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.728558] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.728726] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.731594] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 
65bf8cf0-825c-42d8-bd78-62a6277d29d7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 633.732265] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.732428] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.732631] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.746059] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.746274] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.746438] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.279322] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 
tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 634.292061] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.292262] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.292390] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.344592] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.347598] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.347598] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.347598] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.347598] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 
tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.348856] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.358254] nova-conductor[53039]: DEBUG nova.quota [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Getting quotas for project 69b9135b39df41b49fbd80c72a9cab5c. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 634.361833] nova-conductor[53039]: DEBUG nova.quota [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Getting quotas for user b3a409f2c61a48f784c6b761b1ff1309 and project 69b9135b39df41b49fbd80c72a9cab5c. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 634.376143] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 634.376954] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.377633] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.377698] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.383018] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] 
block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 634.383018] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.383018] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.383276] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.401159] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.401377] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.401591] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.149338] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Took 0.22 seconds 
to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 647.167520] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.167520] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.167812] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.202217] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.203032] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.203279] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.203684] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.203930] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.204124] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.213718] nova-conductor[53039]: DEBUG nova.quota [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Getting quotas for project c2d9ebe5ad8545b9ade33e18b6092a47. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 647.216583] nova-conductor[53039]: DEBUG nova.quota [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Getting quotas for user 56c77841712b4c63b744211d28c87fb7 and project c2d9ebe5ad8545b9ade33e18b6092a47. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 647.223830] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 647.224363] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.224796] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.225048] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.228850] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: 
e4f0342a-4169-40aa-b234-a2e2340d5b05] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 647.229546] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.229751] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.230524] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.249775] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.250016] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.250196] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 
tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 647.534701] nova-conductor[53039]: Traceback (most recent call last): [ 647.534701] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 647.534701] nova-conductor[53039]: return func(*args, **kwargs) [ 647.534701] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 647.534701] nova-conductor[53039]: selections = self._select_destinations( [ 647.534701] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 647.534701] nova-conductor[53039]: selections = self._schedule( [ 647.534701] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 647.534701] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 647.534701] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 647.534701] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 647.534701] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 647.534701] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 647.535452] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.536039] nova-conductor[53039]: ERROR nova.conductor.manager [ 647.547307] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.549861] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.549861] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.651125] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] [instance: 5ad3e093-b7cb-4fe8-b533-506e1f3a92ea] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 647.651125] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 647.651125] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 647.651310] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.656236] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 647.656236] nova-conductor[53039]: Traceback (most recent call last): [ 647.656236] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 647.656236] nova-conductor[53039]: return func(*args, **kwargs) [ 647.656236] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 647.656236] nova-conductor[53039]: selections = self._select_destinations( [ 647.656236] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 647.656236] nova-conductor[53039]: selections = self._schedule( [ 647.656236] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 647.656236] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 647.656236] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 647.656236] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 647.656236] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 647.656236] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 647.657522] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-2b2ce50d-585e-4321-bbe7-49c554f15ba5 tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] [instance: 5ad3e093-b7cb-4fe8-b533-506e1f3a92ea] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 649.303218] nova-conductor[53040]: Traceback (most recent call last): [ 649.303218] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 649.303218] nova-conductor[53040]: return func(*args, **kwargs) [ 649.303218] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 649.303218] nova-conductor[53040]: selections = self._select_destinations( [ 649.303218] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 649.303218] nova-conductor[53040]: selections = self._schedule( [ 649.303218] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 649.303218] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 649.303218] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 649.303218] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 649.303218] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 649.303218] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 649.303935] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.304474] nova-conductor[53040]: ERROR nova.conductor.manager [ 649.313854] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 649.313854] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 649.313854] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 649.377684] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] [instance: b2a0923f-6583-4c4d-9601-5cebd8191c51] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 649.378104] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 649.378104] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 649.378247] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 649.382701] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 649.382701] nova-conductor[53040]: Traceback (most recent call last): [ 649.382701] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 649.382701] nova-conductor[53040]: return func(*args, **kwargs) [ 649.382701] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 649.382701] nova-conductor[53040]: selections = self._select_destinations( [ 649.382701] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 649.382701] nova-conductor[53040]: selections = self._schedule( [ 649.382701] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 649.382701] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 649.382701] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 649.382701] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 649.382701] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 649.382701] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 649.383693] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-67a57505-9e51-428d-9dba-0f99d54e687b tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] [instance: b2a0923f-6583-4c4d-9601-5cebd8191c51] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 650.977213] nova-conductor[53040]: Traceback (most recent call last): [ 650.977213] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.977213] nova-conductor[53040]: return func(*args, **kwargs) [ 650.977213] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.977213] nova-conductor[53040]: selections = self._select_destinations( [ 650.977213] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.977213] nova-conductor[53040]: selections = self._schedule( [ 650.977213] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.977213] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 650.977213] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.977213] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 650.977213] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 650.977213] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 650.978065] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.978870] nova-conductor[53040]: ERROR nova.conductor.manager [ 650.988500] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.990623] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.990623] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 651.071039] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] [instance: 7eb2e403-767f-4253-8173-d1d1e50a2a7b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 651.071039] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 651.071039] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 651.071299] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 651.073699] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 651.073699] nova-conductor[53040]: Traceback (most recent call last): [ 651.073699] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 651.073699] nova-conductor[53040]: return func(*args, **kwargs) [ 651.073699] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 651.073699] nova-conductor[53040]: selections = self._select_destinations( [ 651.073699] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 651.073699] nova-conductor[53040]: selections = self._schedule( [ 651.073699] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 651.073699] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 651.073699] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 651.073699] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 651.073699] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 651.073699] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 651.074263] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-4dc4536c-b7ad-457a-b343-5791b908b95b tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] [instance: 7eb2e403-767f-4253-8173-d1d1e50a2a7b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 652.016528] nova-conductor[53039]: Traceback (most recent call last): [ 652.016528] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 652.016528] nova-conductor[53039]: return func(*args, **kwargs) [ 652.016528] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 652.016528] nova-conductor[53039]: selections = self._select_destinations( [ 652.016528] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 652.016528] nova-conductor[53039]: selections = self._schedule( [ 652.016528] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 652.016528] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 652.016528] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 652.016528] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 652.016528] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 652.016528] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 652.017314] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
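The same failure is recorded here as nova.exception_Remote.NoValidHost_Remote because oslo.messaging rebuilds a server-raised exception on the RPC client side under a synthetic class whose name and module carry a _Remote suffix. A rough sketch of that renaming, using a stand-in exception rather than the real deserialization code:

    # Rough sketch of the "_Remote" naming seen above; the real logic lives in
    # oslo.messaging's exception deserialization, this is only an illustration.
    class NoValidHost(Exception):
        pass

    def as_remote(exc_type):
        # Rebuild the exception under "<Class>_Remote" in "<module>_Remote",
        # mirroring how nova.exception.NoValidHost surfaces in the conductor
        # log as nova.exception_Remote.NoValidHost_Remote.
        return type(exc_type.__name__ + "_Remote", (exc_type,),
                    {"__module__": exc_type.__module__ + "_Remote"})

    RemoteNoValidHost = as_remote(NoValidHost)
    print(RemoteNoValidHost.__module__, RemoteNoValidHost.__name__)
    # e.g. "__main___Remote NoValidHost_Remote" when run directly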
[ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.018435] nova-conductor[53039]: ERROR nova.conductor.manager [ 652.023584] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 652.023806] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 652.024249] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 652.074393] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] [instance: d63c231c-dd64-41b2-82d7-e1b190d37582] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 652.075158] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 652.075415] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 652.075618] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 652.078848] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 652.078848] nova-conductor[53039]: Traceback (most recent call last): [ 652.078848] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 652.078848] nova-conductor[53039]: return func(*args, **kwargs) [ 652.078848] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 652.078848] nova-conductor[53039]: selections = self._select_destinations( [ 652.078848] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 652.078848] nova-conductor[53039]: selections = self._schedule( [ 652.078848] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 652.078848] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 652.078848] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 652.078848] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 652.078848] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 652.078848] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 652.079788] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-2896a89d-3b19-4e62-9572-11111bf832be tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] [instance: d63c231c-dd64-41b2-82d7-e1b190d37582] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 653.820734] nova-conductor[53040]: Traceback (most recent call last): [ 653.820734] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 653.820734] nova-conductor[53040]: return func(*args, **kwargs) [ 653.820734] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 653.820734] nova-conductor[53040]: selections = self._select_destinations( [ 653.820734] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 653.820734] nova-conductor[53040]: selections = self._schedule( [ 653.820734] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 653.820734] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 653.820734] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 653.820734] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 653.820734] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 653.820734] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 653.821452] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.822022] nova-conductor[53040]: ERROR nova.conductor.manager [ 653.829988] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.829988] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.829988] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 653.887131] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] [instance: de5100d9-3a8d-4bc9-a9fa-7bf6f6579085] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 653.887909] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.888583] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.888816] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 653.893171] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 653.893171] nova-conductor[53040]: Traceback (most recent call last): [ 653.893171] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 653.893171] nova-conductor[53040]: return func(*args, **kwargs) [ 653.893171] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 653.893171] nova-conductor[53040]: selections = self._select_destinations( [ 653.893171] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 653.893171] nova-conductor[53040]: selections = self._schedule( [ 653.893171] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 653.893171] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 653.893171] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 653.893171] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 653.893171] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 653.893171] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 653.893846] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-dfdd3304-7b0e-4d9a-8d02-7ebec4019725 tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] [instance: de5100d9-3a8d-4bc9-a9fa-7bf6f6579085] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 654.629049] nova-conductor[53039]: Traceback (most recent call last): [ 654.629049] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.629049] nova-conductor[53039]: return func(*args, **kwargs) [ 654.629049] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.629049] nova-conductor[53039]: selections = self._select_destinations( [ 654.629049] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.629049] nova-conductor[53039]: selections = self._schedule( [ 654.629049] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.629049] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 654.629049] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.629049] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 654.629049] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 654.629049] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 654.629777] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.631402] nova-conductor[53039]: ERROR nova.conductor.manager [ 654.639571] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.639571] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.639732] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.676840] nova-conductor[53040]: Traceback (most recent call last): [ 654.676840] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.676840] nova-conductor[53040]: return func(*args, **kwargs) [ 654.676840] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.676840] nova-conductor[53040]: selections = self._select_destinations( [ 654.676840] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.676840] nova-conductor[53040]: selections = self._schedule( [ 654.676840] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.676840] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 654.676840] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.676840] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 654.676840] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 654.676840] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 654.677412] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.677926] nova-conductor[53040]: ERROR nova.conductor.manager [ 654.685266] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.687329] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.687329] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.793265] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] [instance: b7c0fe3d-aa58-4004-9d58-9e1987bec636] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 654.794162] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] [instance: 5df7a407-d68f-4bbe-8c67-4af1b9309073] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 654.794269] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Acquiring lock 
"00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.794269] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.794458] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.794609] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.794647] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.794647] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.797709] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 654.797709] nova-conductor[53040]: Traceback (most recent call last): [ 654.797709] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.797709] nova-conductor[53040]: return func(*args, **kwargs) [ 654.797709] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.797709] nova-conductor[53040]: selections = self._select_destinations( [ 654.797709] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.797709] nova-conductor[53040]: selections = self._schedule( [ 654.797709] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.797709] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 654.797709] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.797709] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 654.797709] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 654.797709] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.798271] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-462c9a7f-051b-4b01-a024-31b4f6ced4da tempest-ServersAdminTestJSON-1829072578 tempest-ServersAdminTestJSON-1829072578-project-member] [instance: 5df7a407-d68f-4bbe-8c67-4af1b9309073] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.802698] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 654.802698] nova-conductor[53039]: Traceback (most recent call last): [ 654.802698] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.802698] nova-conductor[53039]: return func(*args, **kwargs) [ 654.802698] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.802698] nova-conductor[53039]: selections = self._select_destinations( [ 654.802698] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.802698] nova-conductor[53039]: selections = self._schedule( [ 654.802698] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.802698] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 654.802698] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.802698] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 654.802698] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 654.802698] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 654.803266] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-0cd6098a-a5ae-42d1-b645-d40635f008f0 tempest-VolumesAssistedSnapshotsTest-1395115177 tempest-VolumesAssistedSnapshotsTest-1395115177-project-member] [instance: b7c0fe3d-aa58-4004-9d58-9e1987bec636] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 655.242849] nova-conductor[53039]: Traceback (most recent call last): [ 655.242849] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 655.242849] nova-conductor[53039]: return func(*args, **kwargs) [ 655.242849] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 655.242849] nova-conductor[53039]: selections = self._select_destinations( [ 655.242849] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 655.242849] nova-conductor[53039]: selections = self._schedule( [ 655.242849] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 655.242849] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 655.242849] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 655.242849] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 655.242849] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 655.242849] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 655.243432] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.244038] nova-conductor[53039]: ERROR nova.conductor.manager [ 655.252540] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.252928] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.253215] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.295417] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] [instance: f589f0da-de40-408d-aa9d-9086efeb9213] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 655.296197] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.296333] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.296502] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.299953] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 655.299953] nova-conductor[53039]: Traceback (most recent call last): [ 655.299953] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 655.299953] nova-conductor[53039]: return func(*args, **kwargs) [ 655.299953] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 655.299953] nova-conductor[53039]: selections = self._select_destinations( [ 655.299953] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 655.299953] nova-conductor[53039]: selections = self._schedule( [ 655.299953] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 655.299953] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 655.299953] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 655.299953] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 655.299953] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 655.299953] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 655.300440] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-21213ec1-b30e-4242-9731-976f2de4a0fe tempest-ListServerFiltersTestJSON-310435375 tempest-ListServerFiltersTestJSON-310435375-project-member] [instance: f589f0da-de40-408d-aa9d-9086efeb9213] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 656.462214] nova-conductor[53040]: Traceback (most recent call last): [ 656.462214] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.462214] nova-conductor[53040]: return func(*args, **kwargs) [ 656.462214] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.462214] nova-conductor[53040]: selections = self._select_destinations( [ 656.462214] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.462214] nova-conductor[53040]: selections = self._schedule( [ 656.462214] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.462214] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 656.462214] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.462214] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 656.462214] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 656.462214] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 656.466129] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.467259] nova-conductor[53040]: ERROR nova.conductor.manager [ 656.472337] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.472599] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.472794] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.524247] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] [instance: f0db0f1d-edb1-495d-80bc-73f75501f3c3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 656.524990] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.525332] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.525443] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.528952] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 656.528952] nova-conductor[53040]: Traceback (most recent call last): [ 656.528952] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.528952] nova-conductor[53040]: return func(*args, **kwargs) [ 656.528952] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.528952] nova-conductor[53040]: selections = self._select_destinations( [ 656.528952] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.528952] nova-conductor[53040]: selections = self._schedule( [ 656.528952] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.528952] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 656.528952] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.528952] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 656.528952] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 656.528952] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.529790] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-d5a95e07-3f67-401e-83f0-725a0f63624f tempest-ListImageFiltersTestJSON-941715649 tempest-ListImageFiltersTestJSON-941715649-project-member] [instance: f0db0f1d-edb1-495d-80bc-73f75501f3c3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 659.728219] nova-conductor[53039]: Traceback (most recent call last): [ 659.728219] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.728219] nova-conductor[53039]: return func(*args, **kwargs) [ 659.728219] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.728219] nova-conductor[53039]: selections = self._select_destinations( [ 659.728219] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.728219] nova-conductor[53039]: selections = self._schedule( [ 659.728219] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.728219] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 659.728219] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.728219] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 659.728219] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 659.728219] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 659.728983] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.729705] nova-conductor[53039]: ERROR nova.conductor.manager [ 659.734316] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.734537] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.734709] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.786120] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] [instance: b052cfbc-0641-4ad2-97a8-42cf1cd95eaf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 659.786855] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.787080] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.787254] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.792106] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 659.792106] nova-conductor[53039]: Traceback (most recent call last): [ 659.792106] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.792106] nova-conductor[53039]: return func(*args, **kwargs) [ 659.792106] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.792106] nova-conductor[53039]: selections = self._select_destinations( [ 659.792106] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.792106] nova-conductor[53039]: selections = self._schedule( [ 659.792106] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.792106] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 659.792106] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.792106] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 659.792106] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.792106] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.792106] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-ed02539b-d856-4d36-b0b2-4d9ce0b4f90e tempest-VolumesAdminNegativeTest-957263333 tempest-VolumesAdminNegativeTest-957263333-project-member] [instance: b052cfbc-0641-4ad2-97a8-42cf1cd95eaf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 660.292925] nova-conductor[53040]: Traceback (most recent call last): [ 660.292925] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.292925] nova-conductor[53040]: return func(*args, **kwargs) [ 660.292925] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.292925] nova-conductor[53040]: selections = self._select_destinations( [ 660.292925] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.292925] nova-conductor[53040]: selections = self._schedule( [ 660.292925] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.292925] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 660.292925] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.292925] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 660.292925] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 660.292925] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 660.295340] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.295868] nova-conductor[53040]: ERROR nova.conductor.manager [ 660.299424] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.299804] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.300273] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.346422] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] [instance: fc059271-47e3-4506-8c8d-fa7c9ed46f47] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 660.347128] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.347346] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.347568] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.353022] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 660.353022] nova-conductor[53040]: Traceback (most recent call last): [ 660.353022] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.353022] nova-conductor[53040]: return func(*args, **kwargs) [ 660.353022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.353022] nova-conductor[53040]: selections = self._select_destinations( [ 660.353022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.353022] nova-conductor[53040]: selections = self._schedule( [ 660.353022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.353022] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 660.353022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.353022] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 660.353022] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.353022] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.353022] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-67c3266a-d96f-4854-a86b-fd4644dfda02 tempest-ServersTestFqdnHostnames-1298183815 tempest-ServersTestFqdnHostnames-1298183815-project-member] [instance: fc059271-47e3-4506-8c8d-fa7c9ed46f47] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 661.881887] nova-conductor[53039]: Traceback (most recent call last): [ 661.881887] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.881887] nova-conductor[53039]: return func(*args, **kwargs) [ 661.881887] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.881887] nova-conductor[53039]: selections = self._select_destinations( [ 661.881887] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.881887] nova-conductor[53039]: selections = self._schedule( [ 661.881887] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.881887] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 661.881887] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.881887] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 661.881887] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 661.881887] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 661.882565] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.883024] nova-conductor[53039]: ERROR nova.conductor.manager [ 661.889549] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.893464] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.893464] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.906620] nova-conductor[53040]: Traceback (most recent call last): [ 661.906620] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.906620] nova-conductor[53040]: return func(*args, **kwargs) [ 661.906620] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.906620] nova-conductor[53040]: selections = self._select_destinations( [ 661.906620] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.906620] nova-conductor[53040]: selections = self._schedule( [ 661.906620] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.906620] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 661.906620] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.906620] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 661.906620] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 661.906620] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 661.907969] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.909592] nova-conductor[53040]: ERROR nova.conductor.manager [ 661.913454] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.913677] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.913849] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 661.949902] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] [instance: b79fa25c-7dd3-4df8-84f6-d6c5bd6a1bcd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 661.949902] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.949902] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.950182] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 
tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 661.955091] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 661.955091] nova-conductor[53039]: Traceback (most recent call last): [ 661.955091] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.955091] nova-conductor[53039]: return func(*args, **kwargs) [ 661.955091] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.955091] nova-conductor[53039]: selections = self._select_destinations( [ 661.955091] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.955091] nova-conductor[53039]: selections = self._schedule( [ 661.955091] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.955091] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 661.955091] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.955091] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 661.955091] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 661.955091] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.955091] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-c7d4d0f0-faae-4e0b-97aa-436987e313c6 tempest-ServersWithSpecificFlavorTestJSON-1670131432 tempest-ServersWithSpecificFlavorTestJSON-1670131432-project-member] [instance: b79fa25c-7dd3-4df8-84f6-d6c5bd6a1bcd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
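Each failed request in this log follows the same cycle: the scheduler raises `NoValidHost`, oslo.messaging returns it to the conductor re-raised with a `_Remote` suffix (`nova.exception_Remote.NoValidHost_Remote`), and the conductor then logs "Failed to schedule instances" and sets the affected instance to ERROR. A rough sketch of that caller-side handling pattern, with assumed names that only stand in for the real `schedule_and_build_instances` flow, is:

```python
# Hedged sketch of the conductor-side reaction visible in this log; the names
# below are illustrative stand-ins, not the actual nova/conductor/manager.py code.

class NoValidHost_Remote(Exception):
    """Stand-in for the remote exception re-raised by oslo.messaging."""


def schedule_and_build(instances, select_destinations):
    try:
        return select_destinations()            # RPC call to the scheduler
    except NoValidHost_Remote as exc:
        for instance in instances:              # no host: nothing can be built
            instance["vm_state"] = "error"      # "Setting instance to ERROR state."
            instance["fault"] = str(exc)
        return []


# Example using an instance UUID from the log above.
def failing_select_destinations():
    raise NoValidHost_Remote(
        "No valid host was found. There are not enough hosts available.")

servers = [{"uuid": "f589f0da-de40-408d-aa9d-9086efeb9213", "vm_state": "building"}]
schedule_and_build(servers, failing_select_destinations)
print(servers[0]["vm_state"])  # error
```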
[ 661.960019] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] [instance: 348bfcf3-88e6-4138-92cf-8b3a58a3b4c1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 661.960019] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.960019] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.960294] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 661.962837] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 661.962837] nova-conductor[53040]: Traceback (most recent call last): [ 661.962837] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.962837] nova-conductor[53040]: return func(*args, **kwargs) [ 661.962837] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.962837] nova-conductor[53040]: selections = self._select_destinations( [ 661.962837] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.962837] nova-conductor[53040]: selections = self._schedule( [ 661.962837] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.962837] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 661.962837] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.962837] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 661.962837] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 661.962837] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.963374] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-3b6ee74d-640a-4a0b-b913-57391182c19d tempest-ServersAdmin275Test-711024600 tempest-ServersAdmin275Test-711024600-project-member] [instance: 348bfcf3-88e6-4138-92cf-8b3a58a3b4c1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.871703] nova-conductor[53039]: Traceback (most recent call last): [ 663.871703] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.871703] nova-conductor[53039]: return func(*args, **kwargs) [ 663.871703] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.871703] nova-conductor[53039]: selections = self._select_destinations( [ 663.871703] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.871703] nova-conductor[53039]: selections = self._schedule( [ 663.871703] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.871703] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 663.871703] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.871703] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 663.871703] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 663.871703] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 663.872471] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
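The ERROR block above (closed out by the two trailing ERROR markers just below) shows the conductor-side path: schedule_and_build_instances -> _schedule_instances -> the scheduler query client -> an oslo.messaging call, with the scheduler's exception resurfacing on the client as nova.exception_Remote.NoValidHost_Remote and the original server-side traceback appended underneath. The following is a toy sketch of that "_Remote" naming convention, assuming only what the log itself shows; it is not the oslo.messaging deserialization code.

# Toy illustration of the "_Remote" naming visible in the log. When an RPC
# call fails on the server, the serialized exception is rebuilt on the
# client as a lookalike type whose module and class names carry a "_Remote"
# suffix, with the server traceback appended to the message.
def build_remote_exception(module_name, class_name, message, server_traceback):
    remote_cls = type(class_name + "_Remote", (Exception,),
                      {"__module__": module_name + "_Remote"})
    return remote_cls(message + "\n" + server_traceback)

exc = build_remote_exception(
    "nova.exception", "NoValidHost",
    "No valid host was found. There are not enough hosts available.",
    "Traceback (most recent call last): ...")
print("%s.%s" % (type(exc).__module__, type(exc).__name__))
# -> nova.exception_Remote.NoValidHost_Remote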
[ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.872944] nova-conductor[53039]: ERROR nova.conductor.manager [ 663.882653] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.883101] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.883198] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.928415] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] [instance: 8917be5d-d8e0-41ed-aac2-eefe93808482] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 663.929149] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.929372] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.929575] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 
tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.932525] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 663.932525] nova-conductor[53039]: Traceback (most recent call last): [ 663.932525] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.932525] nova-conductor[53039]: return func(*args, **kwargs) [ 663.932525] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.932525] nova-conductor[53039]: selections = self._select_destinations( [ 663.932525] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.932525] nova-conductor[53039]: selections = self._schedule( [ 663.932525] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.932525] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 663.932525] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.932525] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 663.932525] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 663.932525] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.933071] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-7e1557f6-a295-4888-a002-730942b69aba tempest-AttachInterfacesUnderV243Test-1116763719 tempest-AttachInterfacesUnderV243Test-1116763719-project-member] [instance: 8917be5d-d8e0-41ed-aac2-eefe93808482] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 666.577335] nova-conductor[53040]: Traceback (most recent call last): [ 666.577335] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 666.577335] nova-conductor[53040]: return func(*args, **kwargs) [ 666.577335] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 666.577335] nova-conductor[53040]: selections = self._select_destinations( [ 666.577335] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 666.577335] nova-conductor[53040]: selections = self._schedule( [ 666.577335] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 666.577335] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 666.577335] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 666.577335] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 666.577335] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 666.577335] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 666.578114] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
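Around each failure, the DEBUG oslo_concurrency.lockutils records show the conductor briefly serializing access to its cached cell mapping while it targets cell 00000000-0000-0000-0000-000000000000 (cell0, the cell where instances that never reach a compute host are recorded); the 0.000s waited/held figures indicate the lock is uncontended. Below is a small stdlib-only sketch of a named lock that reports the same waited/held figures; it mirrors the log format only and is not the oslo.concurrency implementation.

import threading
import time
from contextlib import contextmanager

_locks = {}                        # name -> threading.Lock (process-local registry)
_registry_guard = threading.Lock()

@contextmanager
def named_lock(name):
    """Acquire a process-local named lock, reporting wait and hold times."""
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    start = time.monotonic()
    lock.acquire()
    print('Lock "%s" acquired :: waited %.3fs' % (name, time.monotonic() - start))
    held_since = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        print('Lock "%s" released :: held %.3fs' % (name, time.monotonic() - held_since))

with named_lock("00000000-0000-0000-0000-000000000000"):
    pass  # e.g. look up or cache the cell0 database connection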
[ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.578697] nova-conductor[53040]: ERROR nova.conductor.manager [ 666.589976] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.590229] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.590397] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.652561] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 09a03ec9-3269-4ade-af4b-ed49f4f10285] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 666.656951] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.656951] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.656951] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.657755] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 666.657755] nova-conductor[53040]: Traceback (most recent call last): [ 666.657755] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 666.657755] nova-conductor[53040]: return func(*args, **kwargs) [ 666.657755] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 666.657755] nova-conductor[53040]: selections = self._select_destinations( [ 666.657755] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 666.657755] nova-conductor[53040]: selections = self._schedule( [ 666.657755] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 666.657755] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 666.657755] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 666.657755] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 666.657755] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 666.657755] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.658907] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-34023675-2841-491d-afcf-3057f629be04 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 09a03ec9-3269-4ade-af4b-ed49f4f10285] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 666.670392] nova-conductor[53039]: Traceback (most recent call last): [ 666.670392] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 666.670392] nova-conductor[53039]: return func(*args, **kwargs) [ 666.670392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 666.670392] nova-conductor[53039]: selections = self._select_destinations( [ 666.670392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 666.670392] nova-conductor[53039]: selections = self._schedule( [ 666.670392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 666.670392] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 666.670392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 666.670392] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 666.670392] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 666.670392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 666.671213] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
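The WARNING records that follow each of these tracebacks ("Failed to compute_task_build_instances", "Setting instance to ERROR state.") are the conductor's reaction: a NoValidHost from the scheduler is treated as a terminal build failure rather than something to retry. A highly simplified sketch of that reaction follows; the helper names (bury_in_cell0, set_error) are placeholders for illustration, not Nova's actual functions.

class NoValidHost(Exception):
    """Stand-in for the remote NoValidHost raised by the scheduler call."""

def schedule_and_build(instance_uuid, select_destinations, bury_in_cell0, set_error):
    """Try to place an instance; on NoValidHost, record it in cell0 as ERROR."""
    try:
        return select_destinations(instance_uuid)
    except NoValidHost as exc:
        print("Failed to build instance %s: %s" % (instance_uuid, exc))
        bury_in_cell0(instance_uuid)   # persist the never-scheduled instance in cell0
        set_error(instance_uuid, exc)  # vm_state -> ERROR, fault recorded
        return None

def failing_select(_uuid):
    raise NoValidHost("No valid host was found. There are not enough hosts available.")

schedule_and_build("eb2774ba-b9c2-48d7-b5bc-3580d02523ce", failing_select,
                   bury_in_cell0=lambda uuid: None,
                   set_error=lambda uuid, exc: None)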
[ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.671805] nova-conductor[53039]: ERROR nova.conductor.manager [ 666.682741] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.682962] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.683144] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.743615] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: eb2774ba-b9c2-48d7-b5bc-3580d02523ce] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 666.743745] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.743953] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.744150] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.751089] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 666.751089] nova-conductor[53039]: Traceback (most recent call last): [ 666.751089] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 666.751089] nova-conductor[53039]: return func(*args, **kwargs) [ 666.751089] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 666.751089] nova-conductor[53039]: selections = self._select_destinations( [ 666.751089] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 666.751089] nova-conductor[53039]: selections = self._schedule( [ 666.751089] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 666.751089] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 666.751089] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 666.751089] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 666.751089] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 666.751089] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.751635] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-1a88a0d6-7b5a-4120-b197-cbeac4081cac tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: eb2774ba-b9c2-48d7-b5bc-3580d02523ce] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 668.110835] nova-conductor[53040]: Traceback (most recent call last): [ 668.110835] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 668.110835] nova-conductor[53040]: return func(*args, **kwargs) [ 668.110835] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 668.110835] nova-conductor[53040]: selections = self._select_destinations( [ 668.110835] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 668.110835] nova-conductor[53040]: selections = self._schedule( [ 668.110835] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 668.110835] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 668.110835] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 668.110835] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 668.110835] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 668.110835] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 668.111651] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
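Each burst also logs the root-disk block_device_mapping the conductor would have created: source_type='image' (image f5dfd970-7a56-4489-873c-2c3b6fbd9fe9), destination_type='local', boot_index=0, delete_on_termination=True, i.e. an ephemeral root disk built from a Glance image. For reference, one plausible request-side shape for such a mapping, in the block_device_mapping_v2 format accepted by the compute API, is sketched below; the values are read off the logged object, not captured from the actual tempest request.

# One plausible block_device_mapping_v2 entry matching the BlockDeviceMapping
# fields logged by the conductor (reconstructed for illustration).
root_disk_from_image = {
    "boot_index": 0,
    "uuid": "f5dfd970-7a56-4489-873c-2c3b6fbd9fe9",  # image id seen in the log
    "source_type": "image",        # build the root disk from a Glance image
    "destination_type": "local",   # ephemeral local disk, not a Cinder volume
    "delete_on_termination": True,
}
# Such an entry would be passed to the compute API in a server-create request
# under "block_device_mapping_v2".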
[ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.112353] nova-conductor[53040]: ERROR nova.conductor.manager [ 668.121044] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.121044] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.121044] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.191279] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] [instance: c599de60-d129-49fa-abec-cd1d111fa501] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 668.192009] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.192264] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.192399] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.195687] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 668.195687] nova-conductor[53040]: Traceback (most recent call last): [ 668.195687] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 668.195687] nova-conductor[53040]: return func(*args, **kwargs) [ 668.195687] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 668.195687] nova-conductor[53040]: selections = self._select_destinations( [ 668.195687] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 668.195687] nova-conductor[53040]: selections = self._schedule( [ 668.195687] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 668.195687] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 668.195687] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 668.195687] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 668.195687] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 668.195687] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 668.196211] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-2c321776-543b-4c5f-b401-1bd8256d8b5a tempest-AttachInterfacesV270Test-17097979 tempest-AttachInterfacesV270Test-17097979-project-member] [instance: c599de60-d129-49fa-abec-cd1d111fa501] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 671.755215] nova-conductor[53039]: Traceback (most recent call last): [ 671.755215] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 671.755215] nova-conductor[53039]: return func(*args, **kwargs) [ 671.755215] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 671.755215] nova-conductor[53039]: selections = self._select_destinations( [ 671.755215] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 671.755215] nova-conductor[53039]: selections = self._schedule( [ 671.755215] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 671.755215] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 671.755215] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 671.755215] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 671.755215] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 671.755215] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 671.756023] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.756759] nova-conductor[53039]: ERROR nova.conductor.manager [ 671.762064] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.762322] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.762742] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.823639] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] [instance: d42b72bc-5877-4fb5-bd52-0c741f0deaf5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 671.824438] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.824655] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.824870] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.830584] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 671.830584] nova-conductor[53039]: Traceback (most recent call last): [ 671.830584] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 671.830584] nova-conductor[53039]: return func(*args, **kwargs) [ 671.830584] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 671.830584] nova-conductor[53039]: selections = self._select_destinations( [ 671.830584] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 671.830584] nova-conductor[53039]: selections = self._schedule( [ 671.830584] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 671.830584] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 671.830584] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 671.830584] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 671.830584] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 671.830584] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 671.831214] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-007cc61d-6509-4f02-acad-8f20be5d6da7 tempest-ServerShowV257Test-2019902689 tempest-ServerShowV257Test-2019902689-project-member] [instance: d42b72bc-5877-4fb5-bd52-0c741f0deaf5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 673.279221] nova-conductor[53040]: Traceback (most recent call last): [ 673.279221] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.279221] nova-conductor[53040]: return func(*args, **kwargs) [ 673.279221] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.279221] nova-conductor[53040]: selections = self._select_destinations( [ 673.279221] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.279221] nova-conductor[53040]: selections = self._schedule( [ 673.279221] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.279221] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 673.279221] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.279221] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 673.279221] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 673.279221] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 673.280053] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.280632] nova-conductor[53040]: ERROR nova.conductor.manager [ 673.286157] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.286462] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.286926] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.320798] nova-conductor[53039]: Traceback (most recent call last): [ 673.320798] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.320798] nova-conductor[53039]: return func(*args, **kwargs) [ 673.320798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.320798] nova-conductor[53039]: selections = self._select_destinations( [ 673.320798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.320798] nova-conductor[53039]: selections = self._schedule( [ 673.320798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.320798] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 673.320798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.320798] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 673.320798] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 673.320798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 673.321770] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.322438] nova-conductor[53039]: ERROR nova.conductor.manager [ 673.326845] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.327107] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.327287] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.352348] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] [instance: 88c9242f-9cc5-4338-a917-b3d1ba386987] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 673.353093] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.353340] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.353478] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.362022] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 673.362022] nova-conductor[53040]: Traceback (most recent call last): [ 673.362022] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.362022] nova-conductor[53040]: return func(*args, **kwargs) [ 673.362022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.362022] nova-conductor[53040]: selections = self._select_destinations( [ 673.362022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.362022] nova-conductor[53040]: selections = self._schedule( [ 673.362022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.362022] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 673.362022] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.362022] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 673.362022] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.362022] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.362562] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-7ece6ffe-a5b1-4bdf-87f8-f670f9d5ad21 tempest-ServersTestManualDisk-1800173937 tempest-ServersTestManualDisk-1800173937-project-member] [instance: 88c9242f-9cc5-4338-a917-b3d1ba386987] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 673.372579] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] [instance: a36df24c-379e-4194-b8eb-f73b9761fc7f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 673.373332] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.373555] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.373729] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.377609] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 673.377609] nova-conductor[53039]: Traceback (most recent call last): [ 673.377609] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.377609] nova-conductor[53039]: return func(*args, **kwargs) [ 673.377609] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.377609] nova-conductor[53039]: selections = self._select_destinations( [ 673.377609] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.377609] nova-conductor[53039]: selections = self._schedule( [ 673.377609] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.377609] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 673.377609] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.377609] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 673.377609] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.377609] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.377609] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-9a45584c-8140-4832-ae6b-fce014947f32 tempest-ImagesNegativeTestJSON-1901723470 tempest-ImagesNegativeTestJSON-1901723470-project-member] [instance: a36df24c-379e-4194-b8eb-f73b9761fc7f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 687.349703] nova-conductor[53039]: ERROR nova.scheduler.utils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 148f525a-f3c0-40f2-8527-9607cd5e581b was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 687.350797] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Rescheduling: True {{(pid=53039) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 687.354084] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 148f525a-f3c0-40f2-8527-9607cd5e581b.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 148f525a-f3c0-40f2-8527-9607cd5e581b. [ 687.354341] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 148f525a-f3c0-40f2-8527-9607cd5e581b. [ 687.396019] nova-conductor[53039]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] deallocate_for_instance() {{(pid=53039) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 687.618935] nova-conductor[53039]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Instance cache missing network info. 
{{(pid=53039) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 687.623150] nova-conductor[53039]: DEBUG nova.network.neutron [None req-f4e64849-96c2-44a0-ac6a-3831aeb91407 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 148f525a-f3c0-40f2-8527-9607cd5e581b] Updating instance_info_cache with network_info: [] {{(pid=53039) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 688.811721] nova-conductor[53039]: Traceback (most recent call last): [ 688.811721] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 688.811721] nova-conductor[53039]: return func(*args, **kwargs) [ 688.811721] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 688.811721] nova-conductor[53039]: selections = self._select_destinations( [ 688.811721] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 688.811721] nova-conductor[53039]: selections = self._schedule( [ 688.811721] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 688.811721] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 688.811721] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 688.811721] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 688.811721] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.811721] nova-conductor[53039]: ERROR nova.conductor.manager [ 688.819222] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.819542] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.819742] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.875659] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] [instance: 1e75c548-67e3-4360-89b5-b7e70b2faa4e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 688.876421] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.877287] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.877287] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.879999] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 688.879999] nova-conductor[53039]: Traceback (most recent call last): [ 688.879999] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 688.879999] nova-conductor[53039]: return func(*args, **kwargs) [ 688.879999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 688.879999] nova-conductor[53039]: selections = self._select_destinations( [ 688.879999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 688.879999] nova-conductor[53039]: selections = self._schedule( [ 688.879999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 688.879999] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 688.879999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 688.879999] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 688.879999] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 688.879999] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 688.880708] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-554263da-efb4-44f2-b170-1125913e8957 tempest-ServerActionsTestOtherA-1736780769 tempest-ServerActionsTestOtherA-1736780769-project-member] [instance: 1e75c548-67e3-4360-89b5-b7e70b2faa4e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 690.791159] nova-conductor[53040]: Traceback (most recent call last): [ 690.791159] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 690.791159] nova-conductor[53040]: return func(*args, **kwargs) [ 690.791159] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 690.791159] nova-conductor[53040]: selections = self._select_destinations( [ 690.791159] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 690.791159] nova-conductor[53040]: selections = self._schedule( [ 690.791159] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 690.791159] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 690.791159] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 690.791159] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 690.791159] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.791159] nova-conductor[53040]: ERROR nova.conductor.manager [ 690.808753] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.809103] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.809103] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 690.852461] nova-conductor[53039]: Traceback (most recent call last): [ 690.852461] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 690.852461] nova-conductor[53039]: return func(*args, **kwargs) [ 690.852461] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 690.852461] nova-conductor[53039]: selections = self._select_destinations( [ 690.852461] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 690.852461] nova-conductor[53039]: selections = self._schedule( [ 690.852461] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 690.852461] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 690.852461] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 690.852461] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 690.852461] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.852461] nova-conductor[53039]: ERROR nova.conductor.manager [ 690.868769] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.868887] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.870857] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.885576] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] [instance: 405eb9a5-29a7-4838-82a4-38e2a5ae6216] block_device_mapping [BlockDeviceMapping(attachment_id=95fa9e6a-62af-4e4f-8b1e-96acbcdd3331,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='97d3dbb8-7661-40c5-84b5-828336e193b3',volume_size=1,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 690.886386] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.886590] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.886754] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 
tempest-ServerActionsV293TestJSON-670399220-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.890908] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 690.890908] nova-conductor[53040]: Traceback (most recent call last): [ 690.890908] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 690.890908] nova-conductor[53040]: return func(*args, **kwargs) [ 690.890908] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 690.890908] nova-conductor[53040]: selections = self._select_destinations( [ 690.890908] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 690.890908] nova-conductor[53040]: selections = self._schedule( [ 690.890908] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 690.890908] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 690.890908] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 690.890908] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 690.890908] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 690.890908] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 690.892122] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-57f2134b-d22c-4692-ba20-673bd6a0e3d4 tempest-ServerActionsV293TestJSON-670399220 tempest-ServerActionsV293TestJSON-670399220-project-member] [instance: 405eb9a5-29a7-4838-82a4-38e2a5ae6216] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
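The tracebacks above all terminate in nova/scheduler/manager.py, where _ensure_sufficient_hosts raises NoValidHost because fewer hosts survived filtering than the request needed. A minimal sketch of that kind of sufficiency check, using hypothetical names rather than Nova's actual code:

    # Illustrative sketch only; hypothetical names, not Nova's implementation.
    class NoValidHost(Exception):
        """Scheduling could not place the requested instances."""
        def __init__(self, reason):
            super().__init__("No valid host was found. " + reason)

    def ensure_sufficient_hosts(hosts, required_count):
        # Fail the scheduling attempt when fewer candidate hosts remain
        # than the number of instances the request asked for.
        if len(hosts) < required_count:
            raise NoValidHost(reason="There are not enough hosts available.")

    # Example: a request for two instances with a single candidate host left.
    try:
        ensure_sufficient_hosts(hosts=["compute-1"], required_count=2)
    except NoValidHost as exc:
        print(exc)  # No valid host was found. There are not enough hosts available.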
[ 690.937898] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: 64ff238e-335c-4126-8cc6-fcee1bc491c2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 690.938651] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.938864] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.939037] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.948028] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 690.948028] nova-conductor[53039]: Traceback (most recent call last): [ 690.948028] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 690.948028] nova-conductor[53039]: return func(*args, **kwargs) [ 690.948028] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 690.948028] nova-conductor[53039]: selections = self._select_destinations( [ 690.948028] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 690.948028] nova-conductor[53039]: selections = self._schedule( [ 690.948028] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 690.948028] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 690.948028] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 690.948028] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 690.948028] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 690.948028] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 690.950545] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: 64ff238e-335c-4126-8cc6-fcee1bc491c2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 690.989443] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.990074] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.990074] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.048650] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: 1927aa3e-4d9e-43d4-a792-168b61fcf0cd] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 691.049014] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.049227] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.049392] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.059875] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 691.059875] nova-conductor[53039]: Traceback (most recent call last): [ 691.059875] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 691.059875] nova-conductor[53039]: return func(*args, **kwargs) [ 691.059875] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 691.059875] nova-conductor[53039]: selections = self._select_destinations( [ 691.059875] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 691.059875] nova-conductor[53039]: selections = self._schedule( [ 691.059875] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 691.059875] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 691.059875] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 691.059875] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 691.059875] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 691.059875] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 691.062016] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-183ef61d-ad35-45c3-91a9-e0bce92ab34e tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: 1927aa3e-4d9e-43d4-a792-168b61fcf0cd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 691.544408] nova-conductor[53040]: Traceback (most recent call last): [ 691.544408] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 691.544408] nova-conductor[53040]: return func(*args, **kwargs) [ 691.544408] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 691.544408] nova-conductor[53040]: selections = self._select_destinations( [ 691.544408] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 691.544408] nova-conductor[53040]: selections = self._schedule( [ 691.544408] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 691.544408] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 691.544408] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 691.544408] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 691.544408] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.544408] nova-conductor[53040]: ERROR nova.conductor.manager [ 691.562189] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.562423] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.562600] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.615364] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] [instance: c601cfc8-152b-4bdf-854f-55ec681203e0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 691.616070] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.616275] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.616435] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-943779d7-64aa-4938-9035-91280eddefd2 
tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.622542] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 691.622542] nova-conductor[53040]: Traceback (most recent call last): [ 691.622542] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 691.622542] nova-conductor[53040]: return func(*args, **kwargs) [ 691.622542] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 691.622542] nova-conductor[53040]: selections = self._select_destinations( [ 691.622542] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 691.622542] nova-conductor[53040]: selections = self._schedule( [ 691.622542] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 691.622542] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 691.622542] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 691.622542] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 691.622542] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 691.622542] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 691.622542] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-943779d7-64aa-4938-9035-91280eddefd2 tempest-InstanceActionsNegativeTestJSON-1805904341 tempest-InstanceActionsNegativeTestJSON-1805904341-project-member] [instance: c601cfc8-152b-4bdf-854f-55ec681203e0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 692.890962] nova-conductor[53039]: Traceback (most recent call last): [ 692.890962] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 692.890962] nova-conductor[53039]: return func(*args, **kwargs) [ 692.890962] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 692.890962] nova-conductor[53039]: selections = self._select_destinations( [ 692.890962] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 692.890962] nova-conductor[53039]: selections = self._schedule( [ 692.890962] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 692.890962] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 692.890962] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 692.890962] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 692.890962] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.890962] nova-conductor[53039]: ERROR nova.conductor.manager [ 692.901138] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.901138] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.901138] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.962446] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: c0128068-3faa-4f0d-9905-336f75e2ca7d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 692.963451] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.963451] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.963552] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.967512] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 692.967512] nova-conductor[53039]: Traceback (most recent call last): [ 692.967512] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 692.967512] nova-conductor[53039]: return func(*args, **kwargs) [ 692.967512] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 692.967512] nova-conductor[53039]: selections = self._select_destinations( [ 692.967512] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 692.967512] nova-conductor[53039]: selections = self._schedule( [ 692.967512] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 692.967512] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 692.967512] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 692.967512] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 692.967512] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 692.967512] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 692.967952] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-963f0025-c6ed-48cf-8c75-28399d92ffbb tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: c0128068-3faa-4f0d-9905-336f75e2ca7d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 692.974162] nova-conductor[53040]: Traceback (most recent call last): [ 692.974162] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 692.974162] nova-conductor[53040]: return func(*args, **kwargs) [ 692.974162] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 692.974162] nova-conductor[53040]: selections = self._select_destinations( [ 692.974162] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 692.974162] nova-conductor[53040]: selections = self._schedule( [ 692.974162] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 692.974162] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 692.974162] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 692.974162] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 692.974162] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.974162] nova-conductor[53040]: ERROR nova.conductor.manager [ 692.979138] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.979365] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.979578] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.035275] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: cd350089-076b-45b0-84c3-5c4e041ca578] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 693.035275] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.035275] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.035275] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.040389] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 693.040389] nova-conductor[53040]: Traceback (most recent call last): [ 693.040389] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 693.040389] nova-conductor[53040]: return func(*args, **kwargs) [ 693.040389] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 693.040389] nova-conductor[53040]: selections = self._select_destinations( [ 693.040389] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 693.040389] nova-conductor[53040]: selections = self._schedule( [ 693.040389] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 693.040389] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 693.040389] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 693.040389] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 693.040389] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 693.040389] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 693.041987] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-1eba63bf-23de-4512-99aa-6214fd104547 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: cd350089-076b-45b0-84c3-5c4e041ca578] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 694.677017] nova-conductor[53039]: Traceback (most recent call last): [ 694.677017] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.677017] nova-conductor[53039]: return func(*args, **kwargs) [ 694.677017] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.677017] nova-conductor[53039]: selections = self._select_destinations( [ 694.677017] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.677017] nova-conductor[53039]: selections = self._schedule( [ 694.677017] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.677017] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 694.677017] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.677017] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 694.677017] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.677017] nova-conductor[53039]: ERROR nova.conductor.manager [ 694.688486] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.688760] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.688971] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.743287] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] [instance: b3bbeeba-c909-4e99-9482-519444a06f7b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 694.744110] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.744387] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.744594] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 
tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.747854] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 694.747854] nova-conductor[53039]: Traceback (most recent call last): [ 694.747854] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.747854] nova-conductor[53039]: return func(*args, **kwargs) [ 694.747854] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.747854] nova-conductor[53039]: selections = self._select_destinations( [ 694.747854] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.747854] nova-conductor[53039]: selections = self._schedule( [ 694.747854] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.747854] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 694.747854] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.747854] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 694.747854] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 694.747854] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.748598] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] [instance: b3bbeeba-c909-4e99-9482-519444a06f7b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 694.776252] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.776471] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.777245] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.818062] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] [instance: a5c90864-f54e-4378-8ac4-a4ca8b1ee55b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 694.818696] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.818914] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.819181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.822449] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 694.822449] nova-conductor[53039]: Traceback (most recent call last): [ 694.822449] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.822449] nova-conductor[53039]: return func(*args, **kwargs) [ 694.822449] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.822449] nova-conductor[53039]: selections = self._select_destinations( [ 694.822449] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.822449] nova-conductor[53039]: selections = self._schedule( [ 694.822449] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.822449] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 694.822449] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.822449] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 694.822449] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 694.822449] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.823028] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] [instance: a5c90864-f54e-4378-8ac4-a4ca8b1ee55b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 694.858020] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.858020] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.858020] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.907390] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] [instance: 475b06c8-c936-47a7-a969-f1a43b0eab9a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 694.908123] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.908928] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.908928] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.911591] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 694.911591] nova-conductor[53039]: Traceback (most recent call last): [ 694.911591] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.911591] nova-conductor[53039]: return func(*args, **kwargs) [ 694.911591] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.911591] nova-conductor[53039]: selections = self._select_destinations( [ 694.911591] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.911591] nova-conductor[53039]: selections = self._schedule( [ 694.911591] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.911591] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 694.911591] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.911591] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 694.911591] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 694.911591] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.912117] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-51e87503-b4ab-44c4-aa3b-fba9431c2c27 tempest-ListServersNegativeTestJSON-1169879222 tempest-ListServersNegativeTestJSON-1169879222-project-member] [instance: 475b06c8-c936-47a7-a969-f1a43b0eab9a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 694.965466] nova-conductor[53040]: Traceback (most recent call last): [ 694.965466] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.965466] nova-conductor[53040]: return func(*args, **kwargs) [ 694.965466] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.965466] nova-conductor[53040]: selections = self._select_destinations( [ 694.965466] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.965466] nova-conductor[53040]: selections = self._schedule( [ 694.965466] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.965466] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 694.965466] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.965466] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 694.965466] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.965466] nova-conductor[53040]: ERROR nova.conductor.manager [ 694.972597] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.972781] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.972953] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.031033] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] [instance: 5de30574-c0b5-4db3-9129-087d7dd990fc] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 695.031753] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.032010] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.032202] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.036467] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 695.036467] nova-conductor[53040]: Traceback (most recent call last): [ 695.036467] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 695.036467] nova-conductor[53040]: return func(*args, **kwargs) [ 695.036467] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 695.036467] nova-conductor[53040]: selections = self._select_destinations( [ 695.036467] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 695.036467] nova-conductor[53040]: selections = self._schedule( [ 695.036467] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 695.036467] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 695.036467] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 695.036467] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 695.036467] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 695.036467] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 695.036467] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-f3215e6d-3a9e-40ac-a3f0-026d8e4d9b6e tempest-ServerPasswordTestJSON-1057914730 tempest-ServerPasswordTestJSON-1057914730-project-member] [instance: 5de30574-c0b5-4db3-9129-087d7dd990fc] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 695.578999] nova-conductor[53039]: Traceback (most recent call last): [ 695.578999] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 695.578999] nova-conductor[53039]: return func(*args, **kwargs) [ 695.578999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 695.578999] nova-conductor[53039]: selections = self._select_destinations( [ 695.578999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 695.578999] nova-conductor[53039]: selections = self._schedule( [ 695.578999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 695.578999] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 695.578999] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 695.578999] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 695.578999] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.578999] nova-conductor[53039]: ERROR nova.conductor.manager [ 695.588714] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.589137] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.589699] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.637033] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 950891af-2c2c-4d15-b938-1e27fca10f9a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 695.637033] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.637033] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.637881] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 
tempest-DeleteServersAdminTestJSON-550626363-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.640088] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 695.640088] nova-conductor[53039]: Traceback (most recent call last): [ 695.640088] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 695.640088] nova-conductor[53039]: return func(*args, **kwargs) [ 695.640088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 695.640088] nova-conductor[53039]: selections = self._select_destinations( [ 695.640088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 695.640088] nova-conductor[53039]: selections = self._schedule( [ 695.640088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 695.640088] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 695.640088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 695.640088] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 695.640088] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 695.640088] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 695.640999] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-219af57b-1458-408f-844c-0937b369b5c5 tempest-DeleteServersAdminTestJSON-550626363 tempest-DeleteServersAdminTestJSON-550626363-project-member] [instance: 950891af-2c2c-4d15-b938-1e27fca10f9a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 705.229071] nova-conductor[53039]: Traceback (most recent call last): [ 705.229071] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 705.229071] nova-conductor[53039]: return func(*args, **kwargs) [ 705.229071] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 705.229071] nova-conductor[53039]: selections = self._select_destinations( [ 705.229071] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 705.229071] nova-conductor[53039]: selections = self._schedule( [ 705.229071] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 705.229071] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 705.229071] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 705.229071] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 705.229071] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.229071] nova-conductor[53039]: ERROR nova.conductor.manager [ 705.239791] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.241996] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.241996] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.318763] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: f2939eec-c498-4397-8104-7093e78f92f6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 705.319574] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.320146] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.320146] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.325622] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 705.325622] nova-conductor[53039]: Traceback (most recent call last): [ 705.325622] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 705.325622] nova-conductor[53039]: return func(*args, **kwargs) [ 705.325622] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 705.325622] nova-conductor[53039]: selections = self._select_destinations( [ 705.325622] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 705.325622] nova-conductor[53039]: selections = self._schedule( [ 705.325622] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 705.325622] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 705.325622] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 705.325622] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 705.325622] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 705.325622] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 705.325622] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: f2939eec-c498-4397-8104-7093e78f92f6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 705.361047] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.361047] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.361047] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.417290] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: 7f024bb9-899a-4f2b-bc8d-b3de8db8291d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 705.418433] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.418712] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.418964] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.423279] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 705.423279] nova-conductor[53039]: Traceback (most recent call last): [ 705.423279] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 705.423279] nova-conductor[53039]: return func(*args, **kwargs) [ 705.423279] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 705.423279] nova-conductor[53039]: selections = self._select_destinations( [ 705.423279] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 705.423279] nova-conductor[53039]: selections = self._schedule( [ 705.423279] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 705.423279] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 705.423279] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 705.423279] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 705.423279] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 705.423279] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 705.423862] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-0fd95366-1f70-4ca6-81d0-1af516a14bf4 tempest-MultipleCreateTestJSON-1222059822 tempest-MultipleCreateTestJSON-1222059822-project-member] [instance: 7f024bb9-899a-4f2b-bc8d-b3de8db8291d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 708.914078] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 708.930049] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.930279] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.930504] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.976926] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.978038] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.978038] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.978038] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.978218] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.978317] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.990023] nova-conductor[53039]: DEBUG nova.quota [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Getting quotas for project abdf2b93a0a241ae9fa1b395f41da87e. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 708.993429] nova-conductor[53039]: DEBUG nova.quota [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Getting quotas for user 53a288e38bec4962997348279606f1a0 and project abdf2b93a0a241ae9fa1b395f41da87e. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 709.006514] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 709.007085] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.007308] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.007482] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.011375] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] [instance: f202a181-b5ea-4b06-91ad-86356b51e088] 
block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 709.012082] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.012297] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.012468] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.043744] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.044094] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.044280] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.740234] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 
tempest-AttachVolumeNegativeTest-209289787-project-member] Took 0.18 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 709.753452] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.753692] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.753867] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.789171] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.789171] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.789171] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.789527] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.789724] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 
tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.789884] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.798022] nova-conductor[53039]: DEBUG nova.quota [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Getting quotas for project fc9e7dc1863d455c98d44991ab5be2bc. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 709.800572] nova-conductor[53039]: DEBUG nova.quota [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Getting quotas for user 97786b94b75a41b8bbc3db750aa7b8d2 and project fc9e7dc1863d455c98d44991ab5be2bc. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 709.806383] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 709.806872] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.807097] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.807270] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.810234] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 
tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 63151ec9-f383-46cc-ac57-c3f7f1569410] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 709.810962] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.811134] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.811642] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.824735] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.824864] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.825053] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.897111] nova-conductor[53040]: DEBUG nova.conductor.manager 
[None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 714.915840] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.916076] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.917738] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.949496] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.949718] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.949790] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.950119] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.950305] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.950459] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.962070] nova-conductor[53040]: DEBUG nova.quota [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Getting quotas for project 5981d0de9a5545a4b2db5ab222672012. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 714.964251] nova-conductor[53040]: DEBUG nova.quota [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Getting quotas for user 9579a37d71414dae93da5b1490e44c86 and project 5981d0de9a5545a4b2db5ab222672012. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 714.972503] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 714.972503] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.972503] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.972503] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.975137] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] [instance: c5b391a9-7969-4119-9bc6-b0e1fe7a9713] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 714.976455] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.976665] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.976846] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.992789] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.993296] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.994287] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.003196] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 715.025626] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.025864] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.026065] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.062204] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.062692] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.063017] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.066313] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.066313] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.066313] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.085714] nova-conductor[53039]: DEBUG nova.quota [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Getting quotas for project 2140598201444851ab98084d07307c86. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 715.095512] nova-conductor[53039]: DEBUG nova.quota [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Getting quotas for user 0eaa3d12fe1b4a33b50f985d0fe081fa and project 2140598201444851ab98084d07307c86. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 715.102172] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 715.102770] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.102995] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.103503] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.110021] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] [instance: 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 715.110589] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.110772] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.111013] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.125623] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.125888] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.126140] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.389914] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 
tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 716.405857] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.406653] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.406653] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.445968] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.446390] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.446575] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.446949] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.447146] nova-conductor[53039]: 
DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.447306] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.461255] nova-conductor[53039]: DEBUG nova.quota [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Getting quotas for project 4714528fb7fb41eb908a9bda448bdffc. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 716.465518] nova-conductor[53039]: DEBUG nova.quota [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Getting quotas for user e80ef72400df48b9b6a2b8b62fad4d5b and project 4714528fb7fb41eb908a9bda448bdffc. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 716.473357] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 716.473921] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.474148] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.474321] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.478532] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] [instance: 7476fb96-5247-472c-ab92-ef7e5916cb00] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 716.478975] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.479238] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.479416] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.491923] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.492297] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.492364] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 
tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.856153] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 718.877573] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.879851] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.879851] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.879987] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Took 0.18 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 718.895418] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.895612] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.895775] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.916608] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.916608] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.916779] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.917159] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.917342] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.917503] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.926705] nova-conductor[53040]: DEBUG nova.quota [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Getting quotas for project 64df375499704a52a28c8e3086612623. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 718.929940] nova-conductor[53040]: DEBUG nova.quota [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Getting quotas for user 34ead03015734f3eb4679cfd446be51c and project 64df375499704a52a28c8e3086612623. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 718.935094] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.935211] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.935385] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.935736] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.935922] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.936221] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.940695] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 718.942274] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.942274] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.942274] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.952046] nova-conductor[53039]: DEBUG nova.quota [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Getting quotas for project 6cd208d1e842468ea334e506728ad9d4. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 718.952530] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] [instance: 35630c7b-fdf4-4d6d-8e5a-0045f1387f93] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 718.953246] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.953506] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.953618] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.959986] nova-conductor[53039]: DEBUG nova.quota [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Getting quotas for user 19d32e6d286a48099a4a6b39cc21e5c3 and project 6cd208d1e842468ea334e506728ad9d4. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 718.966350] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 837197c0-9ff8-45a2-8bf0-730158a43a17] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 718.966866] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.967095] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.967278] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.968952] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.969072] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.969251] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.970633] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] [instance: 
837197c0-9ff8-45a2-8bf0-730158a43a17] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 718.971314] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.971784] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.971986] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.988360] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.988577] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.990168] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.099254] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 
tempest-DeleteServersTestJSON-1895503581-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 721.110573] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.110795] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.110977] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.157677] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.157949] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.158104] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.158464] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.158766] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 
tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.158931] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.169079] nova-conductor[53039]: DEBUG nova.quota [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Getting quotas for project cb84bdb9ac884f9e8082e8743a3ea695. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 721.171410] nova-conductor[53039]: DEBUG nova.quota [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Getting quotas for user 12e40bad222e48898a0e85cfcd0f8af3 and project cb84bdb9ac884f9e8082e8743a3ea695. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 721.179044] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 56471a78-08cd-4d1a-b3f5-d1eac277183e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 721.179044] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.179224] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.179357] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.182727] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] 
[instance: 56471a78-08cd-4d1a-b3f5-d1eac277183e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 721.183463] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.183663] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.183831] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.198687] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.198895] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.199213] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.572435] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 
tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 725.585651] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.588017] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.588017] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.619611] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.619847] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.620027] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.620450] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.620664] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None 
req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.620839] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.632619] nova-conductor[53040]: DEBUG nova.quota [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Getting quotas for project 01e66175cf394d0f8b898f3d74534759. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 725.634916] nova-conductor[53040]: DEBUG nova.quota [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Getting quotas for user 1b04e83c5794482a870b8020d2cab1f1 and project 01e66175cf394d0f8b898f3d74534759. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 725.641965] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] [instance: 1240824e-c5f1-4517-b182-20245311c687] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 725.642465] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.642701] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.642879] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.646257] nova-conductor[53040]: DEBUG nova.conductor.manager [None 
req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] [instance: 1240824e-c5f1-4517-b182-20245311c687] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 725.646946] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.647167] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.647334] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.663145] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 725.663471] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 725.663672] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.923756] nova-conductor[53040]: ERROR nova.scheduler.utils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 30c40353-01fe-407d-8d56-0f6c166d12e3 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 730.924917] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Rescheduling: True {{(pid=53040) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 730.924917] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 30c40353-01fe-407d-8d56-0f6c166d12e3.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 30c40353-01fe-407d-8d56-0f6c166d12e3. [ 730.925354] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 30c40353-01fe-407d-8d56-0f6c166d12e3. [ 730.984297] nova-conductor[53040]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] deallocate_for_instance() {{(pid=53040) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 731.213223] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Took 0.22 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 731.231337] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.231557] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.231743] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.245604] nova-conductor[53040]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Instance cache missing network info. 
{{(pid=53040) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 731.260597] nova-conductor[53040]: DEBUG nova.network.neutron [None req-a6b49098-f299-45d3-806a-7c484bdafc54 tempest-TenantUsagesTestJSON-830737900 tempest-TenantUsagesTestJSON-830737900-project-member] [instance: 30c40353-01fe-407d-8d56-0f6c166d12e3] Updating instance_info_cache with network_info: [] {{(pid=53040) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.293926] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.294247] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.296163] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.296163] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.296163] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.296163] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.306145] nova-conductor[53039]: DEBUG nova.quota [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Getting quotas for project 64b08879cd024c199dab5b1c85a4d192. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 731.309147] nova-conductor[53039]: DEBUG nova.quota [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Getting quotas for user 866f10c47bbc46a68f211484e269f661 and project 64b08879cd024c199dab5b1c85a4d192. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 731.319399] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] [instance: daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 731.320227] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.320921] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.320921] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.328501] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] [instance: daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 731.328974] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.329182] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.329376] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.349399] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.352362] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.352362] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.534082] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 735.549831] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.550073] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.550247] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.591107] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.591107] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.591107] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.591107] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.591665] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 
tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.591665] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.607070] nova-conductor[53040]: DEBUG nova.quota [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Getting quotas for project b3917aae625b4cfd9a0ab45ad226a4cb. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 735.611172] nova-conductor[53040]: DEBUG nova.quota [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Getting quotas for user 584528a6112f44d0b8c78739e0573670 and project b3917aae625b4cfd9a0ab45ad226a4cb. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 735.622780] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 735.623398] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.623668] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.623936] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.628166] nova-conductor[53040]: DEBUG nova.conductor.manager [None 
req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] [instance: b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 735.628570] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.628776] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.628984] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.646820] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.646820] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.646950] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.780321] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 735.804400] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.804694] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.804980] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.841363] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.841587] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.841760] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.842127] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.842315] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.842474] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.855567] nova-conductor[53039]: DEBUG nova.quota [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Getting quotas for project 6bf7df9c6ad54bda94c69846bf31be3a. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 735.859464] nova-conductor[53039]: DEBUG nova.quota [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Getting quotas for user 19f8a18d3e31489f99b251a2b3d9dd27 and project 6bf7df9c6ad54bda94c69846bf31be3a. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 735.867954] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] [instance: 49aaf98b-945e-4c5d-8158-641b8650a8a7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 735.868484] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.868838] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.869034] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.873537] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] [instance: 49aaf98b-945e-4c5d-8158-641b8650a8a7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 735.874336] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.874545] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.874726] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.891524] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.891771] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.891983] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 
tempest-ImagesOneServerTestJSON-749254656-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.294530] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 737.307508] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.307838] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.308052] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.337138] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.337362] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.337562] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.337915] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring 
lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.338110] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.338271] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.352177] nova-conductor[53040]: DEBUG nova.quota [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Getting quotas for project 728702d8345e40369f0a0d76c06f9806. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 737.354333] nova-conductor[53040]: DEBUG nova.quota [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Getting quotas for user 50fe9abfdcdd48138d07bd4f43ebd2f4 and project 728702d8345e40369f0a0d76c06f9806. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 737.360182] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] [instance: cb7a8413-4414-4de6-8d4f-9ac4f1784f35] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 737.360682] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.360898] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.361079] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.363846] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] [instance: cb7a8413-4414-4de6-8d4f-9ac4f1784f35] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 737.364500] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.364698] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.364862] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.376993] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.377504] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.377504] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.980537] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 740.993568] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.993804] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.993977] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.024181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.024181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.024181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.024181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 
tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.024181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.024181] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.032510] nova-conductor[53039]: DEBUG nova.quota [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Getting quotas for project f210204a4754444fbabb6d15dd8d502f. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 741.034757] nova-conductor[53039]: DEBUG nova.quota [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Getting quotas for user 8c820efe058c49c9b48726a5376a6832 and project f210204a4754444fbabb6d15dd8d502f. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 741.040590] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 741.041105] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.041310] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.041478] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.044856] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 741.045465] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.045667] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.045839] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.058758] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.059201] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.059320] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 779.161352] nova-conductor[53039]: ERROR nova.scheduler.utils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", 
line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 130961ce-1e22-4320-abc9-30fc5f652be3 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 779.162018] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Rescheduling: True {{(pid=53039) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 779.162270] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 130961ce-1e22-4320-abc9-30fc5f652be3.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 130961ce-1e22-4320-abc9-30fc5f652be3. [ 779.162587] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 130961ce-1e22-4320-abc9-30fc5f652be3. [ 779.186718] nova-conductor[53039]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] deallocate_for_instance() {{(pid=53039) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 779.208281] nova-conductor[53039]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Instance cache missing network info. 
{{(pid=53039) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 779.210499] nova-conductor[53039]: DEBUG nova.network.neutron [None req-69f6c840-b833-4649-8b5d-d7616824cc4a tempest-FloatingIPsAssociationTestJSON-930375277 tempest-FloatingIPsAssociationTestJSON-930375277-project-member] [instance: 130961ce-1e22-4320-abc9-30fc5f652be3] Updating instance_info_cache with network_info: [] {{(pid=53039) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.356249] nova-conductor[53040]: Traceback (most recent call last): [ 788.356249] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.356249] nova-conductor[53040]: return func(*args, **kwargs) [ 788.356249] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.356249] nova-conductor[53040]: selections = self._select_destinations( [ 788.356249] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.356249] nova-conductor[53040]: selections = self._schedule( [ 788.356249] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.356249] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 788.356249] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.356249] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 788.356249] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.356249] nova-conductor[53040]: ERROR nova.conductor.manager [ 788.363838] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.364235] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.364513] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.417496] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: a8e0d6fc-d68b-431e-a74a-2bf6dba5c0c6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 788.420023] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.420023] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.420023] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.421959] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 788.421959] nova-conductor[53040]: Traceback (most recent call last): [ 788.421959] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.421959] nova-conductor[53040]: return func(*args, **kwargs) [ 788.421959] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.421959] nova-conductor[53040]: selections = self._select_destinations( [ 788.421959] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.421959] nova-conductor[53040]: selections = self._schedule( [ 788.421959] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.421959] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 788.421959] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.421959] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 788.421959] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 788.421959] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.423212] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-f6e5b367-fa5f-498b-b38c-929821a3d339 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: a8e0d6fc-d68b-431e-a74a-2bf6dba5c0c6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 790.623088] nova-conductor[53039]: Traceback (most recent call last): [ 790.623088] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 790.623088] nova-conductor[53039]: return func(*args, **kwargs) [ 790.623088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 790.623088] nova-conductor[53039]: selections = self._select_destinations( [ 790.623088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 790.623088] nova-conductor[53039]: selections = self._schedule( [ 790.623088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 790.623088] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 790.623088] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 790.623088] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 790.623088] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.623088] nova-conductor[53039]: ERROR nova.conductor.manager [ 790.629684] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 790.629949] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 790.630191] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 790.668986] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: 993ecf40-e9c3-4ba2-9d5c-82e8e8ddd7bd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 790.669767] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 790.670027] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 790.670238] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 790.673589] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 790.673589] nova-conductor[53039]: Traceback (most recent call last): [ 790.673589] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 790.673589] nova-conductor[53039]: return func(*args, **kwargs) [ 790.673589] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 790.673589] nova-conductor[53039]: selections = self._select_destinations( [ 790.673589] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 790.673589] nova-conductor[53039]: selections = self._schedule( [ 790.673589] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 790.673589] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 790.673589] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 790.673589] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 790.673589] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 790.673589] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 790.674243] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-f2afcbf0-4d59-41e4-9c65-df9025bddaee tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: 993ecf40-e9c3-4ba2-9d5c-82e8e8ddd7bd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.251084] nova-conductor[53040]: Traceback (most recent call last): [ 792.251084] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 792.251084] nova-conductor[53040]: return func(*args, **kwargs) [ 792.251084] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 792.251084] nova-conductor[53040]: selections = self._select_destinations( [ 792.251084] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 792.251084] nova-conductor[53040]: selections = self._schedule( [ 792.251084] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 792.251084] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 792.251084] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 792.251084] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 792.251084] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.251084] nova-conductor[53040]: ERROR nova.conductor.manager [ 792.257234] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.257450] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.257647] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.297496] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: c003d4d1-71eb-4c47-958b-cc4c3e7209ea] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 792.298192] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.298400] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.298571] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.301570] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 792.301570] nova-conductor[53040]: Traceback (most recent call last): [ 792.301570] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 792.301570] nova-conductor[53040]: return func(*args, **kwargs) [ 792.301570] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 792.301570] nova-conductor[53040]: selections = self._select_destinations( [ 792.301570] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 792.301570] nova-conductor[53040]: selections = self._schedule( [ 792.301570] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 792.301570] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 792.301570] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 792.301570] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 792.301570] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 792.301570] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.302124] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-da6ca001-3076-414d-8603-f6c14cd5be77 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: c003d4d1-71eb-4c47-958b-cc4c3e7209ea] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 794.511689] nova-conductor[53039]: Traceback (most recent call last): [ 794.511689] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 794.511689] nova-conductor[53039]: return func(*args, **kwargs) [ 794.511689] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 794.511689] nova-conductor[53039]: selections = self._select_destinations( [ 794.511689] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 794.511689] nova-conductor[53039]: selections = self._schedule( [ 794.511689] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 794.511689] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 794.511689] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 794.511689] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 794.511689] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.511689] nova-conductor[53039]: ERROR nova.conductor.manager [ 794.519464] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 794.519464] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 794.519464] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 794.556871] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: 73e4e0b7-5fe8-4ee7-809d-30a2dd6dcbb1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 794.557428] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 794.558760] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 794.558760] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 794.560835] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 794.560835] nova-conductor[53039]: Traceback (most recent call last): [ 794.560835] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 794.560835] nova-conductor[53039]: return func(*args, **kwargs) [ 794.560835] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 794.560835] nova-conductor[53039]: selections = self._select_destinations( [ 794.560835] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 794.560835] nova-conductor[53039]: selections = self._schedule( [ 794.560835] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 794.560835] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 794.560835] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 794.560835] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 794.560835] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 794.560835] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 794.561354] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-14b89004-6098-4793-9d72-6749f0019dc3 tempest-ImagesTestJSON-52580012 tempest-ImagesTestJSON-52580012-project-member] [instance: 73e4e0b7-5fe8-4ee7-809d-30a2dd6dcbb1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 799.565664] nova-conductor[53040]: Traceback (most recent call last): [ 799.565664] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 799.565664] nova-conductor[53040]: return func(*args, **kwargs) [ 799.565664] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 799.565664] nova-conductor[53040]: selections = self._select_destinations( [ 799.565664] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 799.565664] nova-conductor[53040]: selections = self._schedule( [ 799.565664] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 799.565664] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 799.565664] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 799.565664] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 799.565664] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.565664] nova-conductor[53040]: ERROR nova.conductor.manager [ 799.572294] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 799.572509] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 799.572678] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.613202] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] [instance: 3a5267f7-690e-4fa2-890d-d7491ab365b5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 799.613872] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 799.614091] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 799.614272] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.616905] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 799.616905] nova-conductor[53040]: Traceback (most recent call last): [ 799.616905] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 799.616905] nova-conductor[53040]: return func(*args, **kwargs) [ 799.616905] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 799.616905] nova-conductor[53040]: selections = self._select_destinations( [ 799.616905] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 799.616905] nova-conductor[53040]: selections = self._schedule( [ 799.616905] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 799.616905] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 799.616905] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 799.616905] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 799.616905] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 799.616905] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 799.617438] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6e683fad-4f3a-430d-8a4d-a59b12b6d38b tempest-ServersAaction247Test-1710864051 tempest-ServersAaction247Test-1710864051-project-member] [instance: 3a5267f7-690e-4fa2-890d-d7491ab365b5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 804.365392] nova-conductor[53039]: Traceback (most recent call last): [ 804.365392] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 804.365392] nova-conductor[53039]: return func(*args, **kwargs) [ 804.365392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 804.365392] nova-conductor[53039]: selections = self._select_destinations( [ 804.365392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 804.365392] nova-conductor[53039]: selections = self._schedule( [ 804.365392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 804.365392] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 804.365392] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 804.365392] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 804.365392] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.365392] nova-conductor[53039]: ERROR nova.conductor.manager [ 804.393635] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.393859] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 804.394043] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.433393] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] [instance: 1a4d0102-500e-494f-977e-322e51e79d95] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 804.435118] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.435118] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 804.435118] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.437880] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 804.437880] nova-conductor[53039]: Traceback (most recent call last): [ 804.437880] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 804.437880] nova-conductor[53039]: return func(*args, **kwargs) [ 804.437880] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 804.437880] nova-conductor[53039]: selections = self._select_destinations( [ 804.437880] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 804.437880] nova-conductor[53039]: selections = self._schedule( [ 804.437880] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 804.437880] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 804.437880] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 804.437880] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 804.437880] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 804.437880] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 804.438430] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-e96c8610-3a48-4598-aa08-8005c37c57c2 tempest-ServersNegativeTestJSON-954031117 tempest-ServersNegativeTestJSON-954031117-project-member] [instance: 1a4d0102-500e-494f-977e-322e51e79d95] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 828.235406] nova-conductor[53040]: ERROR nova.scheduler.utils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 4540cd82-440c-41e3-8bfa-b384da6fc964 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 828.238026] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Rescheduling: True {{(pid=53040) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 828.238026] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4540cd82-440c-41e3-8bfa-b384da6fc964.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 4540cd82-440c-41e3-8bfa-b384da6fc964. [ 828.238026] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4540cd82-440c-41e3-8bfa-b384da6fc964. [ 828.261436] nova-conductor[53040]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] deallocate_for_instance() {{(pid=53040) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 828.284280] nova-conductor[53040]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Instance cache missing network info. {{(pid=53040) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 828.288052] nova-conductor[53040]: DEBUG nova.network.neutron [None req-c4a6f749-208e-4e3a-8d68-f35f38a5ab33 tempest-ServerDiagnosticsNegativeTest-285185855 tempest-ServerDiagnosticsNegativeTest-285185855-project-member] [instance: 4540cd82-440c-41e3-8bfa-b384da6fc964] Updating instance_info_cache with network_info: [] {{(pid=53040) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 839.521297] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 839.532988] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.533235] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.533407] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.575767] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.575994] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.576182] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.576546] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.576732] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.576918] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.584838] nova-conductor[53040]: DEBUG nova.quota [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Getting quotas for project bf64f2b352e24fe39bc883bbca0e091e. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 839.587287] nova-conductor[53040]: DEBUG nova.quota [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Getting quotas for user 3a776640ecd74bf6b1f54f2a84c1f44b and project bf64f2b352e24fe39bc883bbca0e091e. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 839.593338] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 839.593727] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.593963] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.594132] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.599745] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] [instance: 19881c50-a8ff-411f-b570-d4dc9ef3b0dc] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 839.599745] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.599745] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.599745] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.610808] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.611041] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.611215] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.035548] nova-conductor[53040]: ERROR nova.scheduler.utils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: 
a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 879.037146] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Rescheduling: True {{(pid=53040) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 879.037146] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec. 
[ 879.037348] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a45f24ab-afe1-4ffd-a917-11b68a0b29ec. [ 879.071152] nova-conductor[53040]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] deallocate_for_instance() {{(pid=53040) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 879.090163] nova-conductor[53040]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Instance cache missing network info. {{(pid=53040) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 879.092626] nova-conductor[53040]: DEBUG nova.network.neutron [None req-24f4a051-59a2-4c7d-a10e-f5dc554a9e1d tempest-ServerDiagnosticsTest-1589092301 tempest-ServerDiagnosticsTest-1589092301-project-member] [instance: a45f24ab-afe1-4ffd-a917-11b68a0b29ec] Updating instance_info_cache with network_info: [] {{(pid=53040) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 886.821418] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 886.833714] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.833947] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.834133] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.871521] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.871521] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.871521] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.871521] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.871521] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None 
req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.871521] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.877633] nova-conductor[53040]: DEBUG nova.quota [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Getting quotas for project ff882dc15f1f43358391269a424d2893. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 886.880041] nova-conductor[53040]: DEBUG nova.quota [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Getting quotas for user 7d1a7310911a431db51d3733587adb20 and project ff882dc15f1f43358391269a424d2893. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 886.885402] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 886.885860] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.886032] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.886224] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.889070] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 886.889742] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.889941] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.890119] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.905104] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.905350] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.905517] 
nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 910.059141] nova-conductor[53039]: Traceback (most recent call last): [ 910.059141] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 910.059141] nova-conductor[53039]: return func(*args, **kwargs) [ 910.059141] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 910.059141] nova-conductor[53039]: selections = self._select_destinations( [ 910.059141] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 910.059141] nova-conductor[53039]: selections = self._schedule( [ 910.059141] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 910.059141] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 910.059141] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 910.059141] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 910.059141] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.059141] nova-conductor[53039]: ERROR nova.conductor.manager [ 910.064198] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.064422] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.064593] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.112216] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 9fd550e6-9589-4bb9-bcd2-0ef316185e24] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 910.112216] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.112216] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.112216] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.115300] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 910.115300] nova-conductor[53039]: Traceback (most recent call last): [ 910.115300] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 910.115300] nova-conductor[53039]: return func(*args, **kwargs) [ 910.115300] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 910.115300] nova-conductor[53039]: selections = self._select_destinations( [ 910.115300] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 910.115300] nova-conductor[53039]: selections = self._schedule( [ 910.115300] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 910.115300] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 910.115300] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 910.115300] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 910.115300] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 910.115300] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 910.115786] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-d92aa39b-8c05-4472-81c7-67bc40731e20 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] [instance: 9fd550e6-9589-4bb9-bcd2-0ef316185e24] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 918.752374] nova-conductor[53040]: Traceback (most recent call last): [ 918.752374] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 918.752374] nova-conductor[53040]: return func(*args, **kwargs) [ 918.752374] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 918.752374] nova-conductor[53040]: selections = self._select_destinations( [ 918.752374] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 918.752374] nova-conductor[53040]: selections = self._schedule( [ 918.752374] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 918.752374] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 918.752374] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 918.752374] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 918.752374] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.752374] nova-conductor[53040]: ERROR nova.conductor.manager [ 918.760204] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 918.760442] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 918.760616] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 918.811874] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: bf274d0d-f866-4789-a6ce-295d6722381c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 918.812665] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 918.812879] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 918.813062] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 918.816694] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 918.816694] nova-conductor[53040]: Traceback (most recent call last): [ 918.816694] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 918.816694] nova-conductor[53040]: return func(*args, **kwargs) [ 918.816694] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 918.816694] nova-conductor[53040]: selections = self._select_destinations( [ 918.816694] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 918.816694] nova-conductor[53040]: selections = self._schedule( [ 918.816694] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 918.816694] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 918.816694] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 918.816694] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 918.816694] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 918.816694] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 918.817354] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-89084f6f-fb2a-4134-b87e-23bf25ab8705 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: bf274d0d-f866-4789-a6ce-295d6722381c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 921.099549] nova-conductor[53039]: Traceback (most recent call last): [ 921.099549] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 921.099549] nova-conductor[53039]: return func(*args, **kwargs) [ 921.099549] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 921.099549] nova-conductor[53039]: selections = self._select_destinations( [ 921.099549] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 921.099549] nova-conductor[53039]: selections = self._schedule( [ 921.099549] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 921.099549] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 921.099549] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 921.099549] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 921.099549] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.099549] nova-conductor[53039]: ERROR nova.conductor.manager [ 921.106919] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 921.107158] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 921.107330] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 921.161130] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 2eff9915-37bb-4a67-8fbf-8f9d7cb20e9a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 921.162307] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 921.162508] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 921.163055] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 921.172010] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 921.172010] nova-conductor[53039]: Traceback (most recent call last): [ 921.172010] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 921.172010] nova-conductor[53039]: return func(*args, **kwargs) [ 921.172010] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 921.172010] nova-conductor[53039]: selections = self._select_destinations( [ 921.172010] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 921.172010] nova-conductor[53039]: selections = self._schedule( [ 921.172010] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 921.172010] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 921.172010] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 921.172010] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 921.172010] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 921.172010] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 921.172589] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-6b86d7f8-e5b4-48a1-b63f-ef3bdb6703fd tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 2eff9915-37bb-4a67-8fbf-8f9d7cb20e9a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 922.388446] nova-conductor[53040]: Traceback (most recent call last): [ 922.388446] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 922.388446] nova-conductor[53040]: return func(*args, **kwargs) [ 922.388446] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 922.388446] nova-conductor[53040]: selections = self._select_destinations( [ 922.388446] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 922.388446] nova-conductor[53040]: selections = self._schedule( [ 922.388446] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 922.388446] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 922.388446] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 922.388446] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 922.388446] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.388446] nova-conductor[53040]: ERROR nova.conductor.manager [ 922.395692] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 922.395692] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 922.395692] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.439131] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 97f63dea-99b0-462d-9840-7f0bc7539253] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 922.439881] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 922.440129] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 922.440322] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.443442] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 922.443442] nova-conductor[53040]: Traceback (most recent call last): [ 922.443442] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 922.443442] nova-conductor[53040]: return func(*args, **kwargs) [ 922.443442] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 922.443442] nova-conductor[53040]: selections = self._select_destinations( [ 922.443442] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 922.443442] nova-conductor[53040]: selections = self._schedule( [ 922.443442] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 922.443442] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 922.443442] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 922.443442] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 922.443442] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 922.443442] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 922.443948] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-a14dbcea-f122-41b6-aa6c-84535d414af0 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 97f63dea-99b0-462d-9840-7f0bc7539253] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 925.312136] nova-conductor[53039]: Traceback (most recent call last): [ 925.312136] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 925.312136] nova-conductor[53039]: return func(*args, **kwargs) [ 925.312136] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 925.312136] nova-conductor[53039]: selections = self._select_destinations( [ 925.312136] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 925.312136] nova-conductor[53039]: selections = self._schedule( [ 925.312136] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 925.312136] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 925.312136] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 925.312136] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 925.312136] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.312136] nova-conductor[53039]: ERROR nova.conductor.manager [ 925.322788] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 925.323012] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 925.323210] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 925.376566] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 991d5899-397e-4697-b189-c0f4efe06749] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 925.377302] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 925.377961] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 925.377961] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 925.381025] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 925.381025] nova-conductor[53039]: Traceback (most recent call last): [ 925.381025] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 925.381025] nova-conductor[53039]: return func(*args, **kwargs) [ 925.381025] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 925.381025] nova-conductor[53039]: selections = self._select_destinations( [ 925.381025] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 925.381025] nova-conductor[53039]: selections = self._schedule( [ 925.381025] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 925.381025] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 925.381025] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 925.381025] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 925.381025] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 925.381025] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 925.381926] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-2cfba475-889c-44e0-843f-f5ba1618ff43 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 991d5899-397e-4697-b189-c0f4efe06749] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 927.280020] nova-conductor[53040]: Traceback (most recent call last): [ 927.280020] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 927.280020] nova-conductor[53040]: return func(*args, **kwargs) [ 927.280020] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 927.280020] nova-conductor[53040]: selections = self._select_destinations( [ 927.280020] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 927.280020] nova-conductor[53040]: selections = self._schedule( [ 927.280020] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 927.280020] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 927.280020] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 927.280020] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 927.280020] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.280020] nova-conductor[53040]: ERROR nova.conductor.manager [ 927.291203] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.291203] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.291203] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.335410] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] [instance: e47179d7-479b-460f-897e-c1cdb450f1ae] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 927.336107] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.336420] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.336506] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 
tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.343023] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 927.343023] nova-conductor[53040]: Traceback (most recent call last): [ 927.343023] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 927.343023] nova-conductor[53040]: return func(*args, **kwargs) [ 927.343023] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 927.343023] nova-conductor[53040]: selections = self._select_destinations( [ 927.343023] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 927.343023] nova-conductor[53040]: selections = self._schedule( [ 927.343023] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 927.343023] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 927.343023] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 927.343023] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 927.343023] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 927.343023] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 927.343023] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-497b6f5c-a7f1-45a3-93e4-f3007a741289 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] [instance: e47179d7-479b-460f-897e-c1cdb450f1ae] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 927.705714] nova-conductor[53039]: Traceback (most recent call last): [ 927.705714] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 927.705714] nova-conductor[53039]: return func(*args, **kwargs) [ 927.705714] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 927.705714] nova-conductor[53039]: selections = self._select_destinations( [ 927.705714] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 927.705714] nova-conductor[53039]: selections = self._schedule( [ 927.705714] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 927.705714] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 927.705714] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 927.705714] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 927.705714] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.705714] nova-conductor[53039]: ERROR nova.conductor.manager [ 927.718156] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.718156] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.718156] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.776913] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] [instance: dd37da0a-066c-4513-ac35-eaf6ced29db2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 927.776913] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.776913] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.776913] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.779179] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 927.779179] nova-conductor[53039]: Traceback (most recent call last): [ 927.779179] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 927.779179] nova-conductor[53039]: return func(*args, **kwargs) [ 927.779179] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 927.779179] nova-conductor[53039]: selections = self._select_destinations( [ 927.779179] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 927.779179] nova-conductor[53039]: selections = self._schedule( [ 927.779179] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 927.779179] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 927.779179] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 927.779179] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 927.779179] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 927.779179] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 927.779722] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-72d56085-e9c9-424b-959f-1ba195f6f3c5 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] [instance: dd37da0a-066c-4513-ac35-eaf6ced29db2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 928.461912] nova-conductor[53039]: Traceback (most recent call last): [ 928.461912] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 928.461912] nova-conductor[53039]: return func(*args, **kwargs) [ 928.461912] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 928.461912] nova-conductor[53039]: selections = self._select_destinations( [ 928.461912] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 928.461912] nova-conductor[53039]: selections = self._schedule( [ 928.461912] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 928.461912] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 928.461912] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 928.461912] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 928.461912] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.461912] nova-conductor[53039]: ERROR nova.conductor.manager [ 928.474071] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 928.474260] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 928.474427] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 928.545815] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 14d6f3f3-bd46-4e39-ab08-acf8290c8808] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 928.546933] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 928.547240] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 928.547407] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 928.571927] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 928.571927] nova-conductor[53039]: Traceback (most recent call last): [ 928.571927] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 928.571927] nova-conductor[53039]: return func(*args, **kwargs) [ 928.571927] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 928.571927] nova-conductor[53039]: selections = self._select_destinations( [ 928.571927] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 928.571927] nova-conductor[53039]: selections = self._schedule( [ 928.571927] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 928.571927] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 928.571927] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 928.571927] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 928.571927] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 928.571927] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 928.572637] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-250b3355-3099-449b-9d78-fad31feb5ee4 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 14d6f3f3-bd46-4e39-ab08-acf8290c8808] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 928.623587] nova-conductor[53040]: ERROR nova.scheduler.utils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance cc1d534d-6a43-4575-895d-c3bef84d772e was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 928.624555] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Rescheduling: True {{(pid=53040) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 928.624555] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance cc1d534d-6a43-4575-895d-c3bef84d772e.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance cc1d534d-6a43-4575-895d-c3bef84d772e. 
[ 928.624712] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance cc1d534d-6a43-4575-895d-c3bef84d772e. [ 928.654148] nova-conductor[53040]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] deallocate_for_instance() {{(pid=53040) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 928.671032] nova-conductor[53040]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Instance cache missing network info. {{(pid=53040) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 928.672290] nova-conductor[53039]: DEBUG nova.db.main.api [None req-94fe6431-1bb6-476e-9e72-7a5409337850 tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Created instance_extra for 56471a78-08cd-4d1a-b3f5-d1eac277183e {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 928.675322] nova-conductor[53040]: DEBUG nova.network.neutron [None req-6b807944-0c8c-4945-a94a-e83d6972ee79 tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: cc1d534d-6a43-4575-895d-c3bef84d772e] Updating instance_info_cache with network_info: [] {{(pid=53040) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 928.735058] nova-conductor[53039]: DEBUG nova.db.main.api [None req-d74c6efc-892d-47d0-bc6e-18c31d9337a2 tempest-ServerDiagnosticsV248Test-746765673 tempest-ServerDiagnosticsV248Test-746765673-project-member] Created instance_extra for 1240824e-c5f1-4517-b182-20245311c687 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 928.820197] nova-conductor[53039]: DEBUG nova.db.main.api [None req-b4c28495-2ae6-4055-b784-b74f117db807 tempest-ServersV294TestFqdnHostnames-541059978 tempest-ServersV294TestFqdnHostnames-541059978-project-member] Created instance_extra for daf1f034-cac9-44a9-8fdd-0c4c2d8eaa84 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 929.680007] nova-conductor[53040]: Traceback (most recent call last): [ 929.680007] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 929.680007] nova-conductor[53040]: return func(*args, **kwargs) [ 929.680007] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 929.680007] nova-conductor[53040]: selections = self._select_destinations( [ 929.680007] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 929.680007] nova-conductor[53040]: selections = self._schedule( [ 929.680007] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 929.680007] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 929.680007] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 929.680007] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 929.680007] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.680007] nova-conductor[53040]: ERROR nova.conductor.manager [ 929.686987] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.687232] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.687401] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.733261] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: 051ba276-b0ad-4895-bb4f-7d8b998fa81f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 929.733976] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.734213] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.734379] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.738075] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 929.738075] nova-conductor[53040]: Traceback (most recent call last): [ 929.738075] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 929.738075] nova-conductor[53040]: return func(*args, **kwargs) [ 929.738075] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 929.738075] nova-conductor[53040]: selections = self._select_destinations( [ 929.738075] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 929.738075] nova-conductor[53040]: selections = self._schedule( [ 929.738075] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 929.738075] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 929.738075] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 929.738075] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 929.738075] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 929.738075] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 929.738798] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-dfff668f-a86c-4533-b360-f7c910eff82b tempest-MigrationsAdminTest-620644018 tempest-MigrationsAdminTest-620644018-project-member] [instance: 051ba276-b0ad-4895-bb4f-7d8b998fa81f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 931.298976] nova-conductor[53039]: Traceback (most recent call last): [ 931.298976] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 931.298976] nova-conductor[53039]: return func(*args, **kwargs) [ 931.298976] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 931.298976] nova-conductor[53039]: selections = self._select_destinations( [ 931.298976] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 931.298976] nova-conductor[53039]: selections = self._schedule( [ 931.298976] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 931.298976] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 931.298976] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 931.298976] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 931.298976] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.298976] nova-conductor[53039]: ERROR nova.conductor.manager [ 931.311023] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.311023] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.311023] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.370625] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 169c39b4-b78f-4051-8025-87cc7be85e73] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 931.371653] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.371864] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.372083] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.378367] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 931.378367] nova-conductor[53039]: Traceback (most recent call last): [ 931.378367] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 931.378367] nova-conductor[53039]: return func(*args, **kwargs) [ 931.378367] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 931.378367] nova-conductor[53039]: selections = self._select_destinations( [ 931.378367] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 931.378367] nova-conductor[53039]: selections = self._schedule( [ 931.378367] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 931.378367] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 931.378367] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 931.378367] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 931.378367] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 931.378367] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 931.378894] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-3ff97f10-6e7a-4f71-b406-b22b271da97c tempest-DeleteServersTestJSON-1895503581 tempest-DeleteServersTestJSON-1895503581-project-member] [instance: 169c39b4-b78f-4051-8025-87cc7be85e73] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 932.750798] nova-conductor[53039]: Traceback (most recent call last): [ 932.750798] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 932.750798] nova-conductor[53039]: return func(*args, **kwargs) [ 932.750798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 932.750798] nova-conductor[53039]: selections = self._select_destinations( [ 932.750798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 932.750798] nova-conductor[53039]: selections = self._schedule( [ 932.750798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 932.750798] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 932.750798] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 932.750798] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 932.750798] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.750798] nova-conductor[53039]: ERROR nova.conductor.manager [ 932.757868] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 932.757868] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 932.757868] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 932.802807] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] [instance: 18cb3ab9-4875-45fe-9ee2-041ae14dabac] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 932.802807] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 932.802807] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 932.802807] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 
tempest-AttachVolumeShelveTestJSON-459987434-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 932.807087] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 932.807087] nova-conductor[53039]: Traceback (most recent call last): [ 932.807087] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 932.807087] nova-conductor[53039]: return func(*args, **kwargs) [ 932.807087] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 932.807087] nova-conductor[53039]: selections = self._select_destinations( [ 932.807087] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 932.807087] nova-conductor[53039]: selections = self._schedule( [ 932.807087] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 932.807087] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 932.807087] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 932.807087] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 932.807087] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 932.807087] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 932.807634] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-e13fb1b4-50a9-4d85-9e65-d038a38c3055 tempest-AttachVolumeShelveTestJSON-459987434 tempest-AttachVolumeShelveTestJSON-459987434-project-member] [instance: 18cb3ab9-4875-45fe-9ee2-041ae14dabac] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 933.118527] nova-conductor[53040]: Traceback (most recent call last): [ 933.118527] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.118527] nova-conductor[53040]: return func(*args, **kwargs) [ 933.118527] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.118527] nova-conductor[53040]: selections = self._select_destinations( [ 933.118527] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.118527] nova-conductor[53040]: selections = self._schedule( [ 933.118527] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.118527] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 933.118527] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.118527] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 933.118527] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.118527] nova-conductor[53040]: ERROR nova.conductor.manager [ 933.125268] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.125493] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.125720] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.170958] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] [instance: b2f955f3-1f2d-4a1a-a583-d77ac46193a6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 933.171726] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.171997] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.172118] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.175427] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 933.175427] nova-conductor[53040]: Traceback (most recent call last): [ 933.175427] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.175427] nova-conductor[53040]: return func(*args, **kwargs) [ 933.175427] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.175427] nova-conductor[53040]: selections = self._select_destinations( [ 933.175427] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.175427] nova-conductor[53040]: selections = self._schedule( [ 933.175427] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.175427] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 933.175427] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.175427] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 933.175427] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 933.175427] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.175947] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-bfc9ea79-747b-4d88-bccc-a6473554ef36 tempest-ServersTestMultiNic-825955807 tempest-ServersTestMultiNic-825955807-project-member] [instance: b2f955f3-1f2d-4a1a-a583-d77ac46193a6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 934.542169] nova-conductor[53039]: Traceback (most recent call last): [ 934.542169] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 934.542169] nova-conductor[53039]: return func(*args, **kwargs) [ 934.542169] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 934.542169] nova-conductor[53039]: selections = self._select_destinations( [ 934.542169] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 934.542169] nova-conductor[53039]: selections = self._schedule( [ 934.542169] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 934.542169] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 934.542169] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 934.542169] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 934.542169] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.542169] nova-conductor[53039]: ERROR nova.conductor.manager [ 934.548355] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 934.548580] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 934.548743] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 934.603332] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] [instance: 52645255-a8f6-4934-abe7-c2f96bc04fad] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 934.604089] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 934.604800] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 934.605007] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 
tempest-ServersAdminNegativeTestJSON-788745189-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 934.608081] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 934.608081] nova-conductor[53039]: Traceback (most recent call last): [ 934.608081] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 934.608081] nova-conductor[53039]: return func(*args, **kwargs) [ 934.608081] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 934.608081] nova-conductor[53039]: selections = self._select_destinations( [ 934.608081] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 934.608081] nova-conductor[53039]: selections = self._schedule( [ 934.608081] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 934.608081] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 934.608081] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 934.608081] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 934.608081] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 934.608081] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 934.608648] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-59c7098a-723c-4620-a86e-b670b84408d7 tempest-ServersAdminNegativeTestJSON-788745189 tempest-ServersAdminNegativeTestJSON-788745189-project-member] [instance: 52645255-a8f6-4934-abe7-c2f96bc04fad] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 938.087528] nova-conductor[53040]: Traceback (most recent call last): [ 938.087528] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 938.087528] nova-conductor[53040]: return func(*args, **kwargs) [ 938.087528] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 938.087528] nova-conductor[53040]: selections = self._select_destinations( [ 938.087528] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 938.087528] nova-conductor[53040]: selections = self._schedule( [ 938.087528] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 938.087528] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 938.087528] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 938.087528] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 938.087528] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.087528] nova-conductor[53040]: ERROR nova.conductor.manager [ 938.099233] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.100206] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.100997] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.149721] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: c01ad677-201e-40d3-9c4e-db84e34899d9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 938.151305] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.151305] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.151305] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.155223] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 938.155223] nova-conductor[53040]: Traceback (most recent call last): [ 938.155223] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 938.155223] nova-conductor[53040]: return func(*args, **kwargs) [ 938.155223] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 938.155223] nova-conductor[53040]: selections = self._select_destinations( [ 938.155223] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 938.155223] nova-conductor[53040]: selections = self._schedule( [ 938.155223] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 938.155223] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 938.155223] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 938.155223] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 938.155223] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 938.155223] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 938.155645] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6adf0b78-09b9-413a-8dfc-f960db4ba732 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: c01ad677-201e-40d3-9c4e-db84e34899d9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 940.888358] nova-conductor[53039]: Traceback (most recent call last): [ 940.888358] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 940.888358] nova-conductor[53039]: return func(*args, **kwargs) [ 940.888358] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 940.888358] nova-conductor[53039]: selections = self._select_destinations( [ 940.888358] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 940.888358] nova-conductor[53039]: selections = self._schedule( [ 940.888358] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 940.888358] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 940.888358] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 940.888358] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 940.888358] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.888358] nova-conductor[53039]: ERROR nova.conductor.manager [ 940.895813] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 940.896045] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 940.896221] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 940.956886] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: fb2d61a8-d380-4296-b1d7-8c61e775e465] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 940.958705] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 940.959056] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 940.959543] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 940.964873] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 940.964873] nova-conductor[53039]: Traceback (most recent call last): [ 940.964873] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 940.964873] nova-conductor[53039]: return func(*args, **kwargs) [ 940.964873] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 940.964873] nova-conductor[53039]: selections = self._select_destinations( [ 940.964873] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 940.964873] nova-conductor[53039]: selections = self._schedule( [ 940.964873] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 940.964873] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 940.964873] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 940.964873] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 940.964873] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 940.964873] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 940.964873] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-6edc534a-cf9c-4003-b4bd-805cd01d6e78 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: fb2d61a8-d380-4296-b1d7-8c61e775e465] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 943.404780] nova-conductor[53040]: Traceback (most recent call last): [ 943.404780] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 943.404780] nova-conductor[53040]: return func(*args, **kwargs) [ 943.404780] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 943.404780] nova-conductor[53040]: selections = self._select_destinations( [ 943.404780] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 943.404780] nova-conductor[53040]: selections = self._schedule( [ 943.404780] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 943.404780] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 943.404780] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 943.404780] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 943.404780] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.404780] nova-conductor[53040]: ERROR nova.conductor.manager [ 943.414271] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 943.414512] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 943.414717] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.475957] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 8cba964e-05b8-4781-9f5e-ca6c2d33f824] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 943.475957] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 943.475957] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 943.475957] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 943.479654] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 943.479654] nova-conductor[53040]: Traceback (most recent call last): [ 943.479654] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 943.479654] nova-conductor[53040]: return func(*args, **kwargs) [ 943.479654] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 943.479654] nova-conductor[53040]: selections = self._select_destinations( [ 943.479654] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 943.479654] nova-conductor[53040]: selections = self._schedule( [ 943.479654] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 943.479654] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 943.479654] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 943.479654] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 943.479654] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 943.479654] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 943.482163] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-1277c37e-3d1e-4f61-b982-ae895636fb2c tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 8cba964e-05b8-4781-9f5e-ca6c2d33f824] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 944.230436] nova-conductor[53039]: Traceback (most recent call last): [ 944.230436] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 944.230436] nova-conductor[53039]: return func(*args, **kwargs) [ 944.230436] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 944.230436] nova-conductor[53039]: selections = self._select_destinations( [ 944.230436] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 944.230436] nova-conductor[53039]: selections = self._schedule( [ 944.230436] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 944.230436] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 944.230436] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 944.230436] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 944.230436] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.230436] nova-conductor[53039]: ERROR nova.conductor.manager [ 944.244671] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 944.244906] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 944.245149] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 944.290607] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: 6432f12e-c265-4237-a867-166a57a109fe] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 944.291447] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 944.291730] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 944.291917] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 944.295160] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 944.295160] nova-conductor[53039]: Traceback (most recent call last): [ 944.295160] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 944.295160] nova-conductor[53039]: return func(*args, **kwargs) [ 944.295160] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 944.295160] nova-conductor[53039]: selections = self._select_destinations( [ 944.295160] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 944.295160] nova-conductor[53039]: selections = self._schedule( [ 944.295160] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 944.295160] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 944.295160] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 944.295160] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 944.295160] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 944.295160] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 944.295711] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-e10a74b2-83bf-48d8-a781-607adc980fdf tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: 6432f12e-c265-4237-a867-166a57a109fe] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 945.370981] nova-conductor[53040]: Traceback (most recent call last): [ 945.370981] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 945.370981] nova-conductor[53040]: return func(*args, **kwargs) [ 945.370981] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 945.370981] nova-conductor[53040]: selections = self._select_destinations( [ 945.370981] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 945.370981] nova-conductor[53040]: selections = self._schedule( [ 945.370981] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 945.370981] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 945.370981] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 945.370981] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 945.370981] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.370981] nova-conductor[53040]: ERROR nova.conductor.manager [ 945.379886] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.380108] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.380288] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.444419] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 9d0b5c30-4228-42f1-802a-41e175a83a8c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 945.444419] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.444419] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.444419] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.457479] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 945.457479] nova-conductor[53040]: Traceback (most recent call last): [ 945.457479] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 945.457479] nova-conductor[53040]: return func(*args, **kwargs) [ 945.457479] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 945.457479] nova-conductor[53040]: selections = self._select_destinations( [ 945.457479] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 945.457479] nova-conductor[53040]: selections = self._schedule( [ 945.457479] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 945.457479] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 945.457479] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 945.457479] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 945.457479] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 945.457479] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 945.458545] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-a53c243e-9f38-4e63-bc71-71aa9cd1a5fe tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] [instance: 9d0b5c30-4228-42f1-802a-41e175a83a8c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 946.512533] nova-conductor[53040]: Traceback (most recent call last): [ 946.512533] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 946.512533] nova-conductor[53040]: return func(*args, **kwargs) [ 946.512533] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 946.512533] nova-conductor[53040]: selections = self._select_destinations( [ 946.512533] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 946.512533] nova-conductor[53040]: selections = self._schedule( [ 946.512533] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 946.512533] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 946.512533] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 946.512533] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 946.512533] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.512533] nova-conductor[53040]: ERROR nova.conductor.manager [ 946.524788] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.525031] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.525962] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.589043] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: e155b852-ea2a-4c17-8308-3829681a8a16] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 946.589851] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.590080] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.590287] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.593710] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 946.593710] nova-conductor[53040]: Traceback (most recent call last): [ 946.593710] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 946.593710] nova-conductor[53040]: return func(*args, **kwargs) [ 946.593710] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 946.593710] nova-conductor[53040]: selections = self._select_destinations( [ 946.593710] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 946.593710] nova-conductor[53040]: selections = self._schedule( [ 946.593710] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 946.593710] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 946.593710] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 946.593710] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 946.593710] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 946.593710] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 946.594448] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: e155b852-ea2a-4c17-8308-3829681a8a16] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 946.614998] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.615255] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.615426] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-9f2aaa17-a573-470a-bce1-469f2f45d45e tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 947.074869] nova-conductor[53039]: Traceback (most recent call last): [ 947.074869] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 947.074869] nova-conductor[53039]: return func(*args, **kwargs) [ 947.074869] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 947.074869] nova-conductor[53039]: selections = self._select_destinations( [ 947.074869] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 947.074869] nova-conductor[53039]: selections = self._schedule( [ 947.074869] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 947.074869] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 947.074869] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 947.074869] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 947.074869] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.074869] nova-conductor[53039]: ERROR nova.conductor.manager [ 947.082161] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.082394] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.082645] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.134997] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: f71f47e8-7766-4989-87eb-9d2873caba9c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 947.135727] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.135950] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.136168] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.139143] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 947.139143] nova-conductor[53039]: Traceback (most recent call last): [ 947.139143] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 947.139143] nova-conductor[53039]: return func(*args, **kwargs) [ 947.139143] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 947.139143] nova-conductor[53039]: selections = self._select_destinations( [ 947.139143] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 947.139143] nova-conductor[53039]: selections = self._schedule( [ 947.139143] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 947.139143] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 947.139143] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 947.139143] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 947.139143] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 947.139143] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 947.139670] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-43eb21c9-5352-4f4c-850f-5eec3443d8a2 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: f71f47e8-7766-4989-87eb-9d2873caba9c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 947.601169] nova-conductor[53040]: Traceback (most recent call last): [ 947.601169] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 947.601169] nova-conductor[53040]: return func(*args, **kwargs) [ 947.601169] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 947.601169] nova-conductor[53040]: selections = self._select_destinations( [ 947.601169] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 947.601169] nova-conductor[53040]: selections = self._schedule( [ 947.601169] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 947.601169] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 947.601169] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 947.601169] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 947.601169] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.601169] nova-conductor[53040]: ERROR nova.conductor.manager [ 947.607246] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.607897] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.607897] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.652141] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] [instance: 82fd048a-29e9-4f9b-8a98-183595a0d930] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 947.652940] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.653177] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.653352] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 
tempest-InstanceActionsV221TestJSON-662584722-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.656431] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 947.656431] nova-conductor[53040]: Traceback (most recent call last): [ 947.656431] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 947.656431] nova-conductor[53040]: return func(*args, **kwargs) [ 947.656431] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 947.656431] nova-conductor[53040]: selections = self._select_destinations( [ 947.656431] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 947.656431] nova-conductor[53040]: selections = self._schedule( [ 947.656431] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 947.656431] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 947.656431] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 947.656431] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 947.656431] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 947.656431] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 947.657119] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-3418ca53-4d13-485b-9a3f-f25fcca56526 tempest-InstanceActionsV221TestJSON-662584722 tempest-InstanceActionsV221TestJSON-662584722-project-member] [instance: 82fd048a-29e9-4f9b-8a98-183595a0d930] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 949.037937] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 949.050583] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.050960] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.051235] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.082045] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.082276] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.082446] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.082792] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.082977] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.083166] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.090981] nova-conductor[53040]: DEBUG nova.quota [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Getting quotas for project 6c15bcc07e0a4e4fa73b77d300814d00. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 949.093283] nova-conductor[53040]: DEBUG nova.quota [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Getting quotas for user 1c1392a0b6d441328b27291a96c7ad84 and project 6c15bcc07e0a4e4fa73b77d300814d00. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 949.098991] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 949.099402] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.099597] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.099803] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.102466] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b5ad6145-8bf0-4aed-951b-eb11dd87ed7d] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 949.103126] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.103328] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.103497] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.115974] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.116238] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.116408] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.555132] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Took 0.23 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 951.565908] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.566154] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.566328] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.595650] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.595883] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.596115] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.596580] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.596743] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.596907] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.604728] nova-conductor[53039]: DEBUG nova.quota [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Getting quotas for project 2549d966f11047368e896d5354721163. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 951.607114] nova-conductor[53039]: DEBUG nova.quota [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Getting quotas for user 567d8aa5659348e789f00d70a47382a2 and project 2549d966f11047368e896d5354721163. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 951.612539] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 951.612937] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.613159] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.613343] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.615870] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: c6ee7d41-5522-4019-9da9-8503ec99e2b5] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 951.616515] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.616712] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.616942] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.628779] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.628779] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.628779] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.652080] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Took 0.13 seconds to select destinations for 1 
instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 951.663052] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.663052] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.663052] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.687279] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.687554] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.687726] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.688086] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.688270] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.688427] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.696227] nova-conductor[53040]: DEBUG nova.quota [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Getting quotas for project 2549d966f11047368e896d5354721163. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 951.698578] nova-conductor[53040]: DEBUG nova.quota [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Getting quotas for user 567d8aa5659348e789f00d70a47382a2 and project 2549d966f11047368e896d5354721163. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 951.704147] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 951.704603] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.704799] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.704961] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.707618] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] [instance: d97a55c5-f248-482a-9986-212e84bdd0b0] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 951.708241] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.708434] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.708596] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.721275] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.721483] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.721652] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.457027] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Took 0.11 seconds to select 
destinations for 1 instance(s). {{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 963.468722] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.468955] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.469163] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.497622] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.497861] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.498047] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.498397] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.498578] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock 
"d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.498738] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.507048] nova-conductor[53039]: DEBUG nova.quota [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Getting quotas for project a1fb7769ccc2463094e0dd138a59226e. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 963.509296] nova-conductor[53039]: DEBUG nova.quota [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Getting quotas for user 349d330e9c374dbdab47582c51ca9168 and project a1fb7769ccc2463094e0dd138a59226e. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 963.514615] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 963.515076] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.515275] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.515445] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.518228] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] [instance: 
311eb356-b844-4b1b-a0f0-ed7da6bb9f1d] block_device_mapping [BlockDeviceMapping(attachment_id=716255e7-fa9a-439e-85b7-828ce9acaee6,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='12c96912-3d03-4b7d-9d94-8ff71c6cc5d0',volume_size=1,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 963.518828] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.519035] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.519203] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.533122] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.533329] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.533499] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-136093ad-9777-4432-b06f-c4b00dd320dd tempest-ServersTestBootFromVolume-387843988 tempest-ServersTestBootFromVolume-387843988-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.326993] nova-conductor[53039]: ERROR nova.scheduler.utils [None 
req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 977.327997] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Rescheduling: True {{(pid=53039) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 977.328176] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7. 
[ 977.328681] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 65bf8cf0-825c-42d8-bd78-62a6277d29d7. [ 977.355665] nova-conductor[53039]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] deallocate_for_instance() {{(pid=53039) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 977.373233] nova-conductor[53039]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Instance cache missing network info. {{(pid=53039) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 977.376567] nova-conductor[53039]: DEBUG nova.network.neutron [None req-86d4b597-3073-4e37-8c9d-e9ea1f70e56e tempest-ServerExternalEventsTest-116321567 tempest-ServerExternalEventsTest-116321567-project-member] [instance: 65bf8cf0-825c-42d8-bd78-62a6277d29d7] Updating instance_info_cache with network_info: [] {{(pid=53039) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.401522] nova-conductor[53040]: DEBUG nova.db.main.api [None req-ef4b706f-4629-44f3-be39-af08c87a8497 tempest-ImagesOneServerTestJSON-749254656 tempest-ImagesOneServerTestJSON-749254656-project-member] Created instance_extra for 49aaf98b-945e-4c5d-8158-641b8650a8a7 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 977.463473] nova-conductor[53040]: DEBUG nova.db.main.api [None req-bb0f1347-ba84-4ee0-b6e1-a08a8a353154 tempest-ServerTagsTestJSON-1401197038 tempest-ServerTagsTestJSON-1401197038-project-member] Created instance_extra for cb7a8413-4414-4de6-8d4f-9ac4f1784f35 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 977.520234] nova-conductor[53039]: DEBUG nova.db.main.api [None req-f6e0da48-d31d-4d57-a3a8-7b964135e1c7 tempest-ServerDiskConfigTestJSON-1201106396 tempest-ServerDiskConfigTestJSON-1201106396-project-member] Created instance_extra for 01b62d6f-6718-45b4-8f67-cdb77c5f4bd0 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 982.771804] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.772210] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.772431] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 982.913676] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=53040) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 982.926228] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.926228] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.926228] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 982.951841] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.952298] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.952298] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 982.952613] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.952794] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.952948] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 982.961608] nova-conductor[53040]: DEBUG nova.quota [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Getting quotas for project 6d40db2e2f5c492f92f6943a058f1412. Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 982.963971] nova-conductor[53040]: DEBUG nova.quota [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Getting quotas for user fe442deed75545e8a0c44706c74a99ff and project 6d40db2e2f5c492f92f6943a058f1412. 
Resources: {'instances', 'cores', 'ram'} {{(pid=53040) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 982.969427] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53040) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 982.969973] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.970194] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.970363] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 982.975881] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] [instance: e924a9ab-71c1-4efe-a217-b036ec785dc8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 982.976573] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.976777] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.976945] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 982.991055] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 982.991307] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 982.991470] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.351402] nova-conductor[53040]: DEBUG nova.db.main.api [None req-a2fa7e35-1ec6-401c-8ac8-9902eb4011f9 tempest-AttachVolumeNegativeTest-209289787 tempest-AttachVolumeNegativeTest-209289787-project-member] Created instance_extra for 63151ec9-f383-46cc-ac57-c3f7f1569410 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1028.782161] nova-conductor[53039]: ERROR nova.scheduler.utils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance e4f0342a-4169-40aa-b234-a2e2340d5b05 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1028.782714] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Rescheduling: True {{(pid=53039) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1028.782939] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e4f0342a-4169-40aa-b234-a2e2340d5b05.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e4f0342a-4169-40aa-b234-a2e2340d5b05. [ 1028.783170] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e4f0342a-4169-40aa-b234-a2e2340d5b05. [ 1028.806267] nova-conductor[53039]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] deallocate_for_instance() {{(pid=53039) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1028.855042] nova-conductor[53039]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Instance cache missing network info. 
{{(pid=53039) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1028.862904] nova-conductor[53039]: DEBUG nova.network.neutron [None req-851dd7a4-ac55-40e7-8739-5848d9d1a183 tempest-ServerActionsTestOtherB-1863421808 tempest-ServerActionsTestOtherB-1863421808-project-member] [instance: e4f0342a-4169-40aa-b234-a2e2340d5b05] Updating instance_info_cache with network_info: [] {{(pid=53039) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.134375] nova-conductor[53040]: DEBUG nova.db.main.api [None req-fe91af8a-8fba-42b4-a11d-86b725f8d324 tempest-AttachInterfacesTestJSON-1960723181 tempest-AttachInterfacesTestJSON-1960723181-project-member] Created instance_extra for f202a181-b5ea-4b06-91ad-86356b51e088 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1031.601284] nova-conductor[53039]: DEBUG nova.db.main.api [None req-a5a75d1c-a79f-4dd8-aa61-e901aa56caa2 tempest-SecurityGroupsTestJSON-1049045858 tempest-SecurityGroupsTestJSON-1049045858-project-member] Created instance_extra for 95f71b47-73c8-4a82-b806-f6f2ed9cdbb3 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1031.635329] nova-conductor[53040]: DEBUG nova.db.main.api [None req-f7a1adb9-c4cf-45bc-b153-231a528a7e62 tempest-ServerAddressesNegativeTestJSON-1455610660 tempest-ServerAddressesNegativeTestJSON-1455610660-project-member] Created instance_extra for 7476fb96-5247-472c-ab92-ef7e5916cb00 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1031.675483] nova-conductor[53039]: DEBUG nova.db.main.api [None req-32fc421e-8762-4d7b-a9f1-02f26540cfda tempest-ServersTestJSON-895806724 tempest-ServersTestJSON-895806724-project-member] Created instance_extra for c5b391a9-7969-4119-9bc6-b0e1fe7a9713 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1032.877095] nova-conductor[53040]: DEBUG nova.db.main.api [None req-6cbba487-f263-4a56-ac67-4992b9ab7c51 tempest-ServerMetadataNegativeTestJSON-452243307 tempest-ServerMetadataNegativeTestJSON-452243307-project-member] Created instance_extra for 35630c7b-fdf4-4d6d-8e5a-0045f1387f93 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1032.928429] nova-conductor[53039]: DEBUG nova.db.main.api [None req-fa825ce6-9ad4-453b-8cd9-56c310cbc466 tempest-ServerRescueTestJSON-743816201 tempest-ServerRescueTestJSON-743816201-project-member] Created instance_extra for 837197c0-9ff8-45a2-8bf0-730158a43a17 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1035.776331] nova-conductor[53039]: Traceback (most recent call last): [ 1035.776331] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1035.776331] nova-conductor[53039]: return func(*args, **kwargs) [ 1035.776331] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1035.776331] nova-conductor[53039]: selections = self._select_destinations( [ 1035.776331] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1035.776331] nova-conductor[53039]: selections = self._schedule( [ 1035.776331] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1035.776331] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1035.776331] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1035.776331] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1035.776331] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1035.776331] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.776331] nova-conductor[53039]: ERROR nova.conductor.manager [ 1035.784090] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1035.784405] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1035.784498] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1035.824743] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] [instance: a429b005-fe11-43b6-a07a-d508c9a10ed4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1035.825434] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1035.825646] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1035.825853] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b 
tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1035.828867] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1035.828867] nova-conductor[53039]: Traceback (most recent call last): [ 1035.828867] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1035.828867] nova-conductor[53039]: return func(*args, **kwargs) [ 1035.828867] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1035.828867] nova-conductor[53039]: selections = self._select_destinations( [ 1035.828867] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1035.828867] nova-conductor[53039]: selections = self._schedule( [ 1035.828867] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1035.828867] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1035.828867] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1035.828867] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1035.828867] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1035.828867] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1035.829404] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-22d88010-187b-4c3f-949e-30e5b1055e2b tempest-ServerRescueTestJSONUnderV235-1763884815 tempest-ServerRescueTestJSONUnderV235-1763884815-project-member] [instance: a429b005-fe11-43b6-a07a-d508c9a10ed4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1039.549297] nova-conductor[53040]: Traceback (most recent call last): [ 1039.549297] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1039.549297] nova-conductor[53040]: return func(*args, **kwargs) [ 1039.549297] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1039.549297] nova-conductor[53040]: selections = self._select_destinations( [ 1039.549297] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1039.549297] nova-conductor[53040]: selections = self._schedule( [ 1039.549297] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1039.549297] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1039.549297] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1039.549297] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1039.549297] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 1039.549297] nova-conductor[53040]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.549297] nova-conductor[53040]: ERROR nova.conductor.manager [ 1039.556081] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1039.556352] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1039.556560] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1039.595462] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] [instance: 16c2dd48-f6ce-4b10-89c8-1d7039a5b71b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1039.596172] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1039.596388] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1039.596557] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1039.599201] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1039.599201] nova-conductor[53040]: Traceback (most recent call last): [ 1039.599201] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1039.599201] nova-conductor[53040]: return func(*args, **kwargs) [ 1039.599201] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1039.599201] nova-conductor[53040]: selections = self._select_destinations( [ 1039.599201] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1039.599201] nova-conductor[53040]: selections = self._schedule( [ 1039.599201] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1039.599201] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1039.599201] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1039.599201] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1039.599201] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1039.599201] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1039.599833] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-2175cf4f-037f-4449-bcb5-c3ea6836a983 tempest-ServerShowV254Test-330546274 tempest-ServerShowV254Test-330546274-project-member] [instance: 16c2dd48-f6ce-4b10-89c8-1d7039a5b71b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1063.113833] nova-conductor[53039]: Traceback (most recent call last): [ 1063.113833] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1063.113833] nova-conductor[53039]: return func(*args, **kwargs) [ 1063.113833] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1063.113833] nova-conductor[53039]: selections = self._select_destinations( [ 1063.113833] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1063.113833] nova-conductor[53039]: selections = self._schedule( [ 1063.113833] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1063.113833] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1063.113833] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1063.113833] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1063.113833] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1063.113833] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.113833] nova-conductor[53039]: ERROR nova.conductor.manager [ 1063.120411] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1063.120634] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1063.120805] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1063.160828] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] [instance: 327e66f0-a884-45a2-9bb6-581e53dc75cf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1063.161556] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1063.161771] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1063.161942] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1063.164857] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1063.164857] nova-conductor[53039]: Traceback (most recent call last): [ 1063.164857] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1063.164857] nova-conductor[53039]: return func(*args, **kwargs) [ 1063.164857] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1063.164857] nova-conductor[53039]: selections = self._select_destinations( [ 1063.164857] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1063.164857] nova-conductor[53039]: selections = self._schedule( [ 1063.164857] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1063.164857] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1063.164857] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1063.164857] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1063.164857] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1063.164857] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1063.165414] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-9b00ae4f-d8b5-4164-8f3d-63f8070cbb65 tempest-InstanceActionsTestJSON-85834196 tempest-InstanceActionsTestJSON-85834196-project-member] [instance: 327e66f0-a884-45a2-9bb6-581e53dc75cf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1085.508293] nova-conductor[53039]: DEBUG nova.db.main.api [None req-fabd4c09-225b-4eab-a60c-c4ab202262a4 tempest-ImagesOneServerNegativeTestJSON-855426412 tempest-ImagesOneServerNegativeTestJSON-855426412-project-member] Created instance_extra for b6cdc3c0-d5a6-4e9f-bd90-b38355c74c88 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1131.569641] nova-conductor[53039]: DEBUG nova.db.main.api [None req-836c245e-34e4-4725-a8f7-488797972e2e tempest-ServerActionsTestJSON-973878157 tempest-ServerActionsTestJSON-973878157-project-member] Created instance_extra for 19881c50-a8ff-411f-b570-d4dc9ef3b0dc {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1145.123242] nova-conductor[53040]: Traceback (most recent call last): [ 1145.123242] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1145.123242] nova-conductor[53040]: return func(*args, **kwargs) [ 1145.123242] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1145.123242] nova-conductor[53040]: selections = self._select_destinations( [ 1145.123242] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1145.123242] nova-conductor[53040]: selections = self._schedule( [ 1145.123242] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1145.123242] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1145.123242] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1145.123242] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1145.123242] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 1145.123242] nova-conductor[53040]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.123242] nova-conductor[53040]: ERROR nova.conductor.manager [ 1145.129923] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1145.130153] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1145.130323] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.167533] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b3115175-7069-4f36-ba74-4a22abd5634c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1145.168229] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1145.168435] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1145.168603] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.171445] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1145.171445] nova-conductor[53040]: Traceback (most recent call last): [ 1145.171445] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1145.171445] nova-conductor[53040]: return func(*args, **kwargs) [ 1145.171445] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1145.171445] nova-conductor[53040]: selections = self._select_destinations( [ 1145.171445] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1145.171445] nova-conductor[53040]: selections = self._schedule( [ 1145.171445] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1145.171445] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1145.171445] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1145.171445] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1145.171445] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1145.171445] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1145.172009] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-d07ded15-b762-4e12-975b-d68abe8ad379 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b3115175-7069-4f36-ba74-4a22abd5634c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1146.931613] nova-conductor[53039]: Traceback (most recent call last): [ 1146.931613] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1146.931613] nova-conductor[53039]: return func(*args, **kwargs) [ 1146.931613] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1146.931613] nova-conductor[53039]: selections = self._select_destinations( [ 1146.931613] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1146.931613] nova-conductor[53039]: selections = self._schedule( [ 1146.931613] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1146.931613] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1146.931613] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1146.931613] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1146.931613] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1146.931613] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.931613] nova-conductor[53039]: ERROR nova.conductor.manager [ 1146.937735] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.937973] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.938158] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.976169] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: 4a3e9546-f0e1-46ba-b1a5-6db5be881b22] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1146.976858] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.977092] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.977267] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.983723] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1146.983723] nova-conductor[53039]: Traceback (most recent call last): [ 1146.983723] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1146.983723] nova-conductor[53039]: return func(*args, **kwargs) [ 1146.983723] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1146.983723] nova-conductor[53039]: selections = self._select_destinations( [ 1146.983723] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1146.983723] nova-conductor[53039]: selections = self._schedule( [ 1146.983723] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1146.983723] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1146.983723] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1146.983723] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1146.983723] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1146.983723] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1146.984527] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-62168fc1-06bb-426c-a913-935f89cd12f5 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: 4a3e9546-f0e1-46ba-b1a5-6db5be881b22] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1148.607129] nova-conductor[53040]: Traceback (most recent call last): [ 1148.607129] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1148.607129] nova-conductor[53040]: return func(*args, **kwargs) [ 1148.607129] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1148.607129] nova-conductor[53040]: selections = self._select_destinations( [ 1148.607129] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1148.607129] nova-conductor[53040]: selections = self._schedule( [ 1148.607129] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1148.607129] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1148.607129] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1148.607129] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1148.607129] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 1148.607129] nova-conductor[53040]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.607129] nova-conductor[53040]: ERROR nova.conductor.manager [ 1148.613976] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.614226] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.614397] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.657199] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b702411c-4f1e-467b-acdf-fbbc5217b284] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1148.657919] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.658146] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.658318] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.661205] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1148.661205] nova-conductor[53040]: Traceback (most recent call last): [ 1148.661205] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1148.661205] nova-conductor[53040]: return func(*args, **kwargs) [ 1148.661205] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1148.661205] nova-conductor[53040]: selections = self._select_destinations( [ 1148.661205] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1148.661205] nova-conductor[53040]: selections = self._schedule( [ 1148.661205] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1148.661205] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1148.661205] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1148.661205] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1148.661205] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1148.661205] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1148.661759] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-c6637ec6-9a88-4a06-b3ff-87389f626ce6 tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] [instance: b702411c-4f1e-467b-acdf-fbbc5217b284] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1155.143518] nova-conductor[53039]: Traceback (most recent call last): [ 1155.143518] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1155.143518] nova-conductor[53039]: return func(*args, **kwargs) [ 1155.143518] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1155.143518] nova-conductor[53039]: selections = self._select_destinations( [ 1155.143518] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1155.143518] nova-conductor[53039]: selections = self._schedule( [ 1155.143518] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1155.143518] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1155.143518] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1155.143518] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1155.143518] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1155.143518] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.143518] nova-conductor[53039]: ERROR nova.conductor.manager [ 1155.150863] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1155.151100] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1155.151277] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1155.194959] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] [instance: b86d2634-94f7-4baa-b597-75a29a89de0d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1155.195639] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1155.195849] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1155.196031] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1155.198856] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1155.198856] nova-conductor[53039]: Traceback (most recent call last): [ 1155.198856] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1155.198856] nova-conductor[53039]: return func(*args, **kwargs) [ 1155.198856] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1155.198856] nova-conductor[53039]: selections = self._select_destinations( [ 1155.198856] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1155.198856] nova-conductor[53039]: selections = self._schedule( [ 1155.198856] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1155.198856] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1155.198856] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1155.198856] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1155.198856] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1155.198856] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1155.199382] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-4802071c-8d79-4e1c-bd4d-80bde42a0775 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] [instance: b86d2634-94f7-4baa-b597-75a29a89de0d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1157.617386] nova-conductor[53040]: Traceback (most recent call last): [ 1157.617386] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1157.617386] nova-conductor[53040]: return func(*args, **kwargs) [ 1157.617386] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1157.617386] nova-conductor[53040]: selections = self._select_destinations( [ 1157.617386] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1157.617386] nova-conductor[53040]: selections = self._schedule( [ 1157.617386] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1157.617386] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1157.617386] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1157.617386] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1157.617386] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 1157.617386] nova-conductor[53040]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.617386] nova-conductor[53040]: ERROR nova.conductor.manager [ 1157.624247] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1157.624469] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1157.624737] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1157.664369] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] [instance: c9fa757a-be92-4daa-9c5d-2f5a3269cff6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1157.665161] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1157.665375] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1157.665545] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1157.668841] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1157.668841] nova-conductor[53040]: Traceback (most recent call last): [ 1157.668841] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1157.668841] nova-conductor[53040]: return func(*args, **kwargs) [ 1157.668841] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1157.668841] nova-conductor[53040]: selections = self._select_destinations( [ 1157.668841] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1157.668841] nova-conductor[53040]: selections = self._schedule( [ 1157.668841] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1157.668841] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1157.668841] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1157.668841] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1157.668841] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1157.668841] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1157.669370] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6d46eac3-dede-41be-830f-9d5d3016dbc3 tempest-AttachVolumeTestJSON-862023644 tempest-AttachVolumeTestJSON-862023644-project-member] [instance: c9fa757a-be92-4daa-9c5d-2f5a3269cff6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1177.879301] nova-conductor[53039]: DEBUG nova.db.main.api [None req-1f531249-ae54-4a41-a4d7-9533b7be56ed tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created instance_extra for c6ee7d41-5522-4019-9da9-8503ec99e2b5 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1183.733383] nova-conductor[53039]: Traceback (most recent call last): [ 1183.733383] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1183.733383] nova-conductor[53039]: return func(*args, **kwargs) [ 1183.733383] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1183.733383] nova-conductor[53039]: selections = self._select_destinations( [ 1183.733383] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1183.733383] nova-conductor[53039]: selections = self._schedule( [ 1183.733383] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1183.733383] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1183.733383] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1183.733383] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1183.733383] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1183.733383] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.733383] nova-conductor[53039]: ERROR nova.conductor.manager [ 1183.740094] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1183.740307] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1183.740481] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1183.779659] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] [instance: a3891b54-2b32-468e-8933-403309c68eb2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1183.780400] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1183.780612] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1183.780806] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 
tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1183.783515] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1183.783515] nova-conductor[53039]: Traceback (most recent call last): [ 1183.783515] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1183.783515] nova-conductor[53039]: return func(*args, **kwargs) [ 1183.783515] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1183.783515] nova-conductor[53039]: selections = self._select_destinations( [ 1183.783515] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1183.783515] nova-conductor[53039]: selections = self._schedule( [ 1183.783515] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1183.783515] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1183.783515] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1183.783515] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1183.783515] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1183.783515] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1183.784032] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-7da890ac-f4db-4458-a510-c750d0760102 tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] [instance: a3891b54-2b32-468e-8933-403309c68eb2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1184.082316] nova-conductor[53039]: Traceback (most recent call last): [ 1184.082316] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1184.082316] nova-conductor[53039]: return func(*args, **kwargs) [ 1184.082316] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1184.082316] nova-conductor[53039]: selections = self._select_destinations( [ 1184.082316] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1184.082316] nova-conductor[53039]: selections = self._schedule( [ 1184.082316] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1184.082316] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1184.082316] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1184.082316] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1184.082316] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1184.082316] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.082316] nova-conductor[53039]: ERROR nova.conductor.manager [ 1184.091240] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.091467] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.091636] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.144520] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] [instance: b7617a02-0451-4555-b24b-bce5103e741c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1184.145045] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.145392] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.145488] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 
tempest-ServerRescueNegativeTestJSON-289200803-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.148243] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1184.148243] nova-conductor[53039]: Traceback (most recent call last): [ 1184.148243] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1184.148243] nova-conductor[53039]: return func(*args, **kwargs) [ 1184.148243] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1184.148243] nova-conductor[53039]: selections = self._select_destinations( [ 1184.148243] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1184.148243] nova-conductor[53039]: selections = self._schedule( [ 1184.148243] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1184.148243] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1184.148243] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1184.148243] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1184.148243] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1184.148243] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1184.149208] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-dbe585a7-d83c-4fe8-9c93-0b817a21982b tempest-ServerRescueNegativeTestJSON-289200803 tempest-ServerRescueNegativeTestJSON-289200803-project-member] [instance: b7617a02-0451-4555-b24b-bce5103e741c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1192.296497] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=53039) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 1192.308015] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.308015] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.308015] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.336482] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.336482] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.336482] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.336691] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.337574] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 
tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.337574] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.344514] nova-conductor[53039]: DEBUG nova.quota [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Getting quotas for project 07237bcd8b47450cae1f09b3c693038e. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 1192.346768] nova-conductor[53039]: DEBUG nova.quota [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Getting quotas for user 101ef53aa5c0412b8a7cd0abe6761419 and project 07237bcd8b47450cae1f09b3c693038e. Resources: {'instances', 'cores', 'ram'} {{(pid=53039) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 1192.352689] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=53039) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 1192.353251] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.353251] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.353430] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.356040] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1192.357020] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.357020] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.357020] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.369556] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Acquiring lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.369782] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.369953] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 
tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Lock "d0ae54bd-e0f8-43cb-838f-391ff3e34a72" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1226.589064] nova-conductor[53040]: DEBUG nova.db.main.api [None req-7f62efef-7994-4253-9212-0d5293f44bef tempest-ServersTestJSON-1437394991 tempest-ServersTestJSON-1437394991-project-member] Created instance_extra for b5ad6145-8bf0-4aed-951b-eb11dd87ed7d {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1276.158248] nova-conductor[53040]: ERROR nova.scheduler.utils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1276.158820] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 
tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Rescheduling: True {{(pid=53040) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1276.159058] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef. [ 1276.159279] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef. [ 1276.180580] nova-conductor[53040]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] deallocate_for_instance() {{(pid=53040) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1276.232843] nova-conductor[53040]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Instance cache missing network info. {{(pid=53040) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1276.239306] nova-conductor[53040]: DEBUG nova.network.neutron [None req-6351ee06-cea0-4ee6-921b-037b51068a0e tempest-FloatingIPsAssociationNegativeTestJSON-971461043 tempest-FloatingIPsAssociationNegativeTestJSON-971461043-project-member] [instance: 0257c136-6f30-43ae-8f8d-e8f23d8328ef] Updating instance_info_cache with network_info: [] {{(pid=53040) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1276.485014] nova-conductor[53039]: DEBUG nova.db.main.api [None req-bc3dd74e-b01b-42fa-9b0c-efa2ef8ce8e8 tempest-ServerShowV247Test-1963305943 tempest-ServerShowV247Test-1963305943-project-member] Created instance_extra for d97a55c5-f248-482a-9986-212e84bdd0b0 {{(pid=53039) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1281.314964] nova-conductor[53040]: Traceback (most recent call last): [ 1281.314964] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1281.314964] nova-conductor[53040]: return func(*args, **kwargs) [ 1281.314964] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1281.314964] nova-conductor[53040]: selections = self._select_destinations( [ 1281.314964] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1281.314964] nova-conductor[53040]: selections = self._schedule( [ 1281.314964] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1281.314964] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1281.314964] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1281.314964] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1281.314964] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager result = self.transport._send( [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager raise result [ 1281.314964] nova-conductor[53040]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager selections = self._schedule( [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.314964] nova-conductor[53040]: ERROR nova.conductor.manager [ 1281.321741] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1281.321967] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1281.322162] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1281.368653] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] [instance: 20828633-5581-41da-ad0c-bf9f0b798042] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53040) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1281.369784] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1281.369784] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1281.369784] nova-conductor[53040]: DEBUG oslo_concurrency.lockutils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53040) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1281.375984] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1281.375984] nova-conductor[53040]: Traceback (most recent call last): [ 1281.375984] nova-conductor[53040]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1281.375984] nova-conductor[53040]: return func(*args, **kwargs) [ 1281.375984] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1281.375984] nova-conductor[53040]: selections = self._select_destinations( [ 1281.375984] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1281.375984] nova-conductor[53040]: selections = self._schedule( [ 1281.375984] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1281.375984] nova-conductor[53040]: self._ensure_sufficient_hosts( [ 1281.375984] nova-conductor[53040]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1281.375984] nova-conductor[53040]: raise exception.NoValidHost(reason=reason) [ 1281.375984] nova-conductor[53040]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1281.375984] nova-conductor[53040]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1281.375984] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-645482c2-f528-4832-a327-ba7cbb5479c9 tempest-ServerAddressesTestJSON-831818212 tempest-ServerAddressesTestJSON-831818212-project-member] [instance: 20828633-5581-41da-ad0c-bf9f0b798042] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1287.010654] nova-conductor[53039]: Traceback (most recent call last): [ 1287.010654] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1287.010654] nova-conductor[53039]: return func(*args, **kwargs) [ 1287.010654] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1287.010654] nova-conductor[53039]: selections = self._select_destinations( [ 1287.010654] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1287.010654] nova-conductor[53039]: selections = self._schedule( [ 1287.010654] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1287.010654] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1287.010654] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1287.010654] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1287.010654] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager result = self.transport._send( [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager raise result [ 1287.010654] nova-conductor[53039]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager selections = self._schedule( [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.010654] nova-conductor[53039]: ERROR nova.conductor.manager [ 1287.017388] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1287.017619] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1287.017797] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1287.056449] nova-conductor[53039]: DEBUG nova.conductor.manager [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] [instance: 9b4f102f-ed35-4635-8568-f8f58c6697d3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='f5dfd970-7a56-4489-873c-2c3b6fbd9fe9',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=53039) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1287.057142] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1287.057361] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1287.057539] nova-conductor[53039]: DEBUG oslo_concurrency.lockutils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=53039) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1287.060660] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1287.060660] nova-conductor[53039]: Traceback (most recent call last): [ 1287.060660] nova-conductor[53039]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1287.060660] nova-conductor[53039]: return func(*args, **kwargs) [ 1287.060660] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1287.060660] nova-conductor[53039]: selections = self._select_destinations( [ 1287.060660] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1287.060660] nova-conductor[53039]: selections = self._schedule( [ 1287.060660] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1287.060660] nova-conductor[53039]: self._ensure_sufficient_hosts( [ 1287.060660] nova-conductor[53039]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1287.060660] nova-conductor[53039]: raise exception.NoValidHost(reason=reason) [ 1287.060660] nova-conductor[53039]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1287.060660] nova-conductor[53039]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1287.061142] nova-conductor[53039]: WARNING nova.scheduler.utils [None req-919b1e3f-2563-40a3-8902-f1b660e6d8f3 tempest-ServerMetadataTestJSON-301977506 tempest-ServerMetadataTestJSON-301977506-project-member] [instance: 9b4f102f-ed35-4635-8568-f8f58c6697d3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1322.807015] nova-conductor[53040]: DEBUG nova.db.main.api [None req-e1386e7d-41ed-4b89-977f-4611b40d6790 tempest-ServerGroupTestJSON-1525229369 tempest-ServerGroupTestJSON-1525229369-project-member] Created instance_extra for e924a9ab-71c1-4efe-a217-b036ec785dc8 {{(pid=53040) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1371.618261] nova-conductor[53040]: ERROR nova.scheduler.utils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance f114d70b-3524-4f1c-b1af-71ae3235d040 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1371.619609] nova-conductor[53040]: DEBUG nova.conductor.manager [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Rescheduling: True {{(pid=53040) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1371.619609] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 
tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f114d70b-3524-4f1c-b1af-71ae3235d040.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f114d70b-3524-4f1c-b1af-71ae3235d040. [ 1371.619609] nova-conductor[53040]: WARNING nova.scheduler.utils [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f114d70b-3524-4f1c-b1af-71ae3235d040. [ 1371.636685] nova-conductor[53040]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] deallocate_for_instance() {{(pid=53040) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 1371.653115] nova-conductor[53040]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Instance cache missing network info. {{(pid=53040) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 1371.655699] nova-conductor[53040]: DEBUG nova.network.neutron [None req-29ed356a-f239-4acf-b1cb-4cb2453bd2c4 tempest-ServersNegativeTestMultiTenantJSON-459744605 tempest-ServersNegativeTestMultiTenantJSON-459744605-project-member] [instance: f114d70b-3524-4f1c-b1af-71ae3235d040] Updating instance_info_cache with network_info: [] {{(pid=53040) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
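The two "Error from last host" entries (instances 0257c136-6f30-43ae-8f8d-e8f23d8328ef and f114d70b-3524-4f1c-b1af-71ae3235d040) carry the compute-side traceback as a Python list of strings with escaped newlines, which is hard to read inline. In both cases the root cause is oslo_vmware.exceptions.VimFaultException ("A specified parameter was not correct: fileType", fault InvalidArgument) raised while copy_virtual_disk waits on the vCenter task during image caching; _build_and_run_instance turns it into RescheduledException, and because the original scheduling offered no alternates (Alternates: []), the conductor ends the retry chain with MaxRetriesExceeded and sets the instance to ERROR. A small sketch, assuming only that the logged list has been pasted into a Python session (the literal below is abbreviated from the first entry), shows how to render it back into a readable traceback:

# Abbreviated fragments copied from the "Error from last host" entry for
# instance 0257c136-6f30-43ae-8f8d-e8f23d8328ef above; the real list has
# one string per frame of the compute-side traceback.
error_from_last_host = [
    'Traceback (most recent call last):\n',
    '  File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, '
    'in copy_virtual_disk\n    session._wait_for_task(vmdk_copy_task)\n',
    "oslo_vmware.exceptions.VimFaultException: A specified parameter was "
    "not correct: fileType\nFaults: ['InvalidArgument']\n",
    '\nDuring handling of the above exception, another exception occurred:\n\n',
    'nova.exception.RescheduledException: Build of instance '
    '0257c136-6f30-43ae-8f8d-e8f23d8328ef was re-scheduled: A specified '
    'parameter was not correct: fileType\n',
]

# The fragments already contain their newlines, so joining them restores
# the traceback as the compute service originally raised it.
print("".join(error_from_last_host))

Whether this is done programmatically or by replacing the escaped "\n" sequences in an editor is a matter of taste; the point is that both reschedules in this section trace back to the same vCenter InvalidArgument fault on fileType, not to two different problems.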